Evaluation Methods Sample Clauses

Evaluation Methods. The Association may present to the Board of Trustees its views on an evaluation instrument and/or propose an evaluation instrument. The Board of Trustees will make the final selection of the evaluation instrument. The School District will maintain a uniform evaluation instrument for all teachers or groups of teachers or activities. The School District will seek the input of all the individual teachers affected by an evaluation instrument and/or its application before implementing or changing the same.
Evaluation Methods. All monitoring or observation of work performance of an Employee for purposes of a formal evaluation will be conducted openly and with full knowledge of the Employee. Covert methods will not be used for formal or informal observation of teaching performance.
Evaluation Methods. The CO/COR will conduct performance evaluations based upon Section II above and the required performance levels set forth in the contract and/or Task Orders. The following techniques will be used to perform surveillance:
Evaluation Methods. Nicor Gas will work with the independent third-party evaluator and SAG to develop best practices for when program evaluations shall use randomized controlled trials or quasi-experimental design methods. Until SAG develops best practices, where appropriate given the EM&V budget constraints, the independent evaluator shall give preference to randomized controlled trials or quasi-experimental design methods. When a program evaluator believes that randomized controlled trials or quasi-experimental designs are not appropriate given the EM&V budget constraints, the program evaluator shall provide an explanation and support for its decision as part of its evaluation plan.
Evaluation Methods. The closeness of fit of the proposed deep brain-tuned transformation with the MNI brain was compared with that of the untransformed brain and a surface-tuned transformation using the following four numerical measures. The first measured the percentage of lateral ventricle coordinates correctly falling within the MNI lateral ventricle; a higher proportion indicates better agreement. The second measure was the mean distance in mm between the MNI lateral ventricle coordinates and each of the three sets of coordinates being compared; a smaller number indicates better agreement. The last two measures were the 5th and 95th percentiles of the individual distances in mm used to calculate the mean distance; a smaller numerical value indicates better agreement. Numerical stability of the final estimate of Θ was investigated by using several different initial starting values for Θ̂ in the numerical minimization routine. Ultimately, an initial starting value of (1, 1, 1, 1, 0) was used for Θ̂ to obtain the final estimate of Θ reported above. This vector of initial values corresponds to no scaling, as indicated by the first three ones in the vector, and no rotation, as indicated by the last one and the zero in the vector. One should note that the final one in the vector can be replaced by any non-zero number a and still correspond to no rotation, since arctan(0/a) = 0 provided a ≠ 0. More general affine transformations with as many as nine parameters were attempted, with little to no reduction in the mean surface distance.
3 Results
Convergence using the initial starting vector in section 2.2 took 47 iterations, with a final mean distance of 2.468 mm compared with a starting mean distance of 2.823 mm. The final numerical value of Θ̂ is approximately (1.039, 0.939, 1.261, −38.808, 0.251). Recalling equation (1), the first three parameter estimates show slight scaling in the x and y directions and greater scaling in the z direction. The last two estimated parameters result in a rotation about (0, −38.808, 0.251) of approximately −0.00633 radians, or −0.363 degrees. When several different starting values were used to obtain an estimate of Θ, all reasonable starting values yielded estimates of Θ that agreed to within 3 significant digits of the estimate given above. Thus, the final reported estimate appears to be numerically stable. For ease of use, the simplified transformation obtained by substituting Θ̂ into equations (2) and (3) is: xj...
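The fitting procedure described above can be sketched numerically. The following is a minimal sketch, assuming the five parameters of Θ act as three axis scalings followed by a single rotation through the angle arctan(θ5/θ4); the excerpt does not give the rotation plane or centre explicitly, so rotating about the origin in the y-z plane is an assumption here, as are all function names. The objective mirrors the paper's criterion: the mean distance from transformed coordinates to the reference (e.g. MNI) coordinates.

```python
import numpy as np

def transform(theta, pts):
    """Apply the 5-parameter transformation sketched in the text:
    anisotropic scaling by (t1, t2, t3), then rotation by the angle
    arctan(t5/t4). Rotation plane/centre are assumptions, not stated
    in the excerpt."""
    t1, t2, t3, t4, t5 = theta
    scaled = pts * np.array([t1, t2, t3])
    angle = np.arctan2(t5, t4)  # arctan(0/a) = 0 for any a != 0
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[1.0, 0.0, 0.0],
                    [0.0,   c,  -s],
                    [0.0,   s,   c]])
    return scaled @ rot.T

def mean_surface_distance(theta, src, target):
    """Mean nearest-neighbour distance from transformed source
    coordinates to the target coordinate set (mm in the paper)."""
    moved = transform(theta, src)
    d = np.linalg.norm(moved[:, None, :] - target[None, :, :], axis=-1)
    return d.min(axis=1).mean()

# The starting value (1, 1, 1, 1, 0) is the identity: no scaling,
# no rotation (arctan(0/1) = 0), so points are left unchanged.
pts = np.random.default_rng(0).normal(size=(10, 3))
identity = (1.0, 1.0, 1.0, 1.0, 0.0)
assert np.allclose(transform(identity, pts), pts)
```

In practice this objective would be passed to a numerical minimization routine (the excerpt reports 47 iterations to convergence); any gradient-free or quasi-Newton optimizer over the five parameters fits that description.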
Evaluation Methods. We applied a cross-sectional design with a mixed-methods approach in this evaluation, utilizing both qualitative and quantitative evaluation methods. This section describes how we designed and conducted the mid-term evaluation, which covered the period October 2018 to July 2019. Key staff and research assistants were trained on human subjects’ participation, protocol-specific activities, data abstraction, interviewing, and data entry.
Evaluation Methods. The scientific dimension can be evaluated by the number of articles (and their impact), conferences, presentations given, and students trained through a project.
• Questionnaires and evaluation surveys are widely accepted but sometimes fail for several reasons:
o people do not want to use their time to answer them;
o some institutions have their own standard feedback mechanisms, so it is difficult to introduce modifications or propose new ones;
o sometimes standard questionnaires might prevent facilitators from asking the right questions.
• The number of people who have attended a workshop or have been reached through social networks is a very weak measure of actual success or impact. “A deep change in one thousand might be better than one million retweets.” Xxxx
• Demographics are necessary to see whether people without science training have been reached.
• Qualitative evaluation is advisable to find out what participants think.
o It is good practice to train oneself in observing and listening to participants.
o Interviews are a fruitful way to get deeper insights.
• Find useful ways to measure learning (knowledge) and changes in attitudes and conceptions. Sometimes a simple drawing explains more than a five-page questionnaire.
• Longitudinal studies to see whether activities change the future behaviour of participants are highly advisable. For example: “students involved in citizen science are more likely to include citizen science methods if they become researchers later in their life”.
• Sharing a coffee after an activity in another informal place allows facilitators to gather relevant information off the record.
Evaluation Methods. The SSRP/A Local NGO Grants Component was evaluated using site interviews with key staff of the Local Partner NGOs (LPs) and other relevant stakeholders (when available), and a review of Mercy Corps / ORT indicators and other documentation. Mercy Corps / ORT Local Grants Department staff provided information about the programs and performance of each of the LPs, which was then used to select the evaluation sample. Five LPs were chosen for the evaluation, given the unique case that each represented; this type of selection was intended to show a wide range of the different models and types of impact represented by the SSRP/A LPs. The site interview outline (see Appendix) included the following categories: community involvement in rehab and other activities; structures for community involvement used or created; impact of Mercy Corps / ORT technical assistance and training; relationship of the organization to local and other authorities (rehab cases only); follow-up and/or spin-off projects; and questions related to the individual program and its results. The LPs and the reasons for their selection are summarized in the table below: LP name Reason for selection1
Evaluation Methods. Describe how your progress regarding each objective will be measured. How will you know, and show others, that you have achieved your learning objectives? Describe your final activities/projects for each objective and/or strategy. What are the “deliverables” (e.g. final reports, project briefings, project implementations, etc.)? Who receives them? Will you compile records of your activities or the outcomes of your activities throughout the internship? Will the project you’re working on be implemented or incorporated into something larger?
LEARNING AGENDA
Learning Objectives: What I intend to learn. Be specific! Minimum 5
Strategies: What I will do/how I will learn it. Be creative! Minimum 7
Evaluation Methods: How will I know if I’ve achieved it? Minimum 3
APPROVALS
Internship Supervisor: I have discussed this internship agreement with the intern and agree to each of the following:
• The employer and information sections (p. 1) of this agreement are accurate.
• I will assign work to the intern that supports the spirit and purpose of the learning agenda.
• I will provide the intern with an orientation to relevant organizational arrangements, procedures, and functions.
• I will meet with the intern regularly and make myself available for counsel and advice.
• I will provide feedback on the intern’s performance to him/her and through an online survey upon completion of the internship.
• I understand that I am encouraged to contact the course instructor should any questions or concerns arise.
Evaluation Methods. The student and faculty sponsor and field supervisor must have a clear understanding of how each completed objective will be evaluated and a grade determined. Exams, essays, research papers, reports, self-studies, demonstrations, presentations, job diaries, software or computer programs, creative projects and other methods can be used to document the learning accomplished.