Evaluation Design Sample Clauses

Evaluation Design. USD 489 will utilize the e4E Evaluation Tool designed by Southwest Plains Regional Service Center (SWPRSC) for teacher evaluation. *See forms in Appendix.
Evaluation Design. The Evaluation Design shall include the following core components to be approved by CMS:
Evaluation Design. The evaluation design will utilize a post-only assessment with a comparison group. The timeframe for the post-only period will begin when the current demonstration period begins and end when the current demonstration period ends.
Evaluation Design. The evaluation design will be based on a mixed methods approach. Both qualitative and quantitative methods will be used to address evaluation questions. The CDC’s updated guidelines for evaluating surveillance systems will be used to determine the efficacy of the surveillance system. The following five evaluation questions will be addressed:
Evaluation Design. The State must submit to CMS for approval a draft evaluation design no later than January 1, 2007. At a minimum, the draft design must include a discussion of the goals, objectives, and evaluation questions specific to the purposes of and expenditures made by the State for its health care reform activities. The draft design must discuss the outcome measures that will be used in evaluating the impact of these activities on the efficient operation of the State’s health care system during the period of the Demonstration. The outcome measures below represent agreed-upon metrics under which the State and CMS can measure the shared financial benefit of the health care reforms and must be included in the evaluation design: • Nursing home admissions - “Value of Averted Medicaid Nursing Home Admissions”: For each fiscal year under the demonstration, the reduction in Demonstration Year (DY) Medicaid bed-days below the Base Year (BY) level * average cost per bed-day * DY Medicaid enrollees.
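The averted-admissions metric above can be illustrated with a short calculation. This is a minimal sketch, not part of any clause: the figures are hypothetical, and it assumes the bed-day quantities are expressed per enrollee (an assumption implied by the final multiplication by DY enrollment).

```python
# Illustrative calculation of the "Value of Averted Medicaid Nursing Home
# Admissions" metric. All figures are hypothetical; bed-days are assumed
# to be per-enrollee rates.
def averted_admissions_value(by_bed_days_per_enrollee: float,
                             dy_bed_days_per_enrollee: float,
                             avg_cost_per_bed_day: float,
                             dy_enrollees: int) -> float:
    """Value = (BY bed-days - DY bed-days) * cost per bed-day * DY enrollees."""
    # Only a reduction below the Base Year level counts toward the metric.
    reduction = max(by_bed_days_per_enrollee - dy_bed_days_per_enrollee, 0.0)
    return reduction * avg_cost_per_bed_day * dy_enrollees

# Example: 2.0 vs. 1.5 bed-days per enrollee, $200 per bed-day, 10,000 enrollees
value = averted_admissions_value(2.0, 1.5, 200.0, 10_000)  # 1,000,000.0
```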
Evaluation Design. Provide information on how the evaluation will be designed. For example, will the evaluation utilize a pre/post comparison? A post-only assessment? Will a comparison group be included?
Evaluation Design. An exploratory cross-sectional study design was used in this study. This design was selected because the study sought to explore the phenomena under study, and it was cross-sectional because data about the phenomena were collected at a single point in time.
Evaluation Design. The overall evaluation design involves randomly assigning youth to one of two conditions: treatment (special services) or control (usual care). This design is referred to as a randomized control trial (RCT). The treatment group will receive a variety of evidence-based services above and beyond what they would otherwise be provided. The control group will receive the services traditionally provided by the system.
Pre-randomization Data Collection and Transmittal
Random Assignment Database
Each youth who is deemed eligible for participation will be randomized to either the treatment or control condition. After eligibility has been established, the Intake Worker of the [Primary Service Provider] will log into a secure web application, referred to hereafter as the “Random Assignment Database,” using a pre-assigned username and password. The Random Assignment Database is used for randomly assigning youth and creating an electronic record of the random assignment outcome. After logging into the system, the Intake Worker enters the county-issued Youth Identification Number (YIN) for the youth to be randomized. The Intake Worker then re-enters the YIN in a separate field, and both entries must match in order to help minimize data-entry errors; the Intake Worker is prompted that the YINs must match in order to proceed. If the numbers match, the Intake Worker selects the referring county from a drop-down menu of pre-defined sites, certifies that all the information is correct by checking a box, and then formally submits the information. After the information is submitted, the Intake Worker is immediately shown a screen that provides the random assignment outcome (i.e., treatment or control) for that specific youth.
After the Intake Worker clicks enter, all the data from the web application session is saved in an underlying database (described in the following section). The Intake Worker is presented with a new screen that indicates whether the youth is assigned to special services or control. The database automatically generates an e-mail record of the randomization outcome, the referring county, the site ratio (refer to Randomization Procedures), and the date and time when the outcome was generated. The underlying database stores a comprehensive record of the user session, which is described in Appendix B.
Random Number Methodology
The randomization function inside the applicati...
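The intake and assignment workflow described above can be sketched in code. This is a minimal, illustrative sketch only: the site list, field names, and 1:1 assignment ratio are assumptions for the example, not terms of the clause, and the actual application's random number methodology is described separately.

```python
import datetime
import random

# Hypothetical set of pre-defined referring counties (the drop-down sites).
SITES = {"County A", "County B", "County C"}

def randomize_youth(yin: str, yin_confirm: str, county: str, rng=None) -> dict:
    """Double-entry YIN check, site validation, and random assignment."""
    if yin != yin_confirm:
        # Mirrors the prompt that both YIN entries must match to proceed.
        raise ValueError("YIN entries must match before proceeding")
    if county not in SITES:
        raise ValueError(f"unknown referring county: {county}")
    rng = rng or random.Random()
    outcome = rng.choice(["treatment", "control"])  # 1:1 ratio assumed
    # Electronic record of the outcome, mirroring the e-mailed fields.
    return {
        "yin": yin,
        "county": county,
        "outcome": outcome,
        "timestamp": datetime.datetime.now().isoformat(),
    }
```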
Evaluation Design. This evaluation employs post-only analyses. Because the FPW program was initiated over 20 years ago, a pre-post approach is not ideal. Because the majority of women eligible for the FPW program do not enroll in a given year, there is an opportunity for a relevant comparison group for several of the evaluation questions. Thus, this will be a post-only analysis with a comparison group, in which outcomes for FPW enrollees will be compared to outcomes for a comparison group consisting of women who are eligible for FPW but do not enroll in the program. The qualitative design is discussed in the context of specific research questions in “Analytic Methods” below.
Evaluation Design. The draft design must discuss the outcome measures that shall be used in evaluating the impact of the Demonstration during the period of approval. It shall discuss the data sources, including the use of Medicaid encounter data, and sampling methodology for assessing these outcomes. The draft evaluation design must include a detailed analysis plan that describes how the effects of the Demonstration shall be isolated from other initiatives occurring in the State. The evaluation designs proposed for each question may include analysis at the beneficiary, provider, and aggregate program level, as appropriate, and include population stratifications to the extent feasible, for further depth and to glean potential non-equivalent effects on different sub-groups. The draft design shall identify whether the State will conduct the evaluation, or select an outside contractor for the evaluation.