Scientific Evaluation Sample Clauses

Scientific Evaluation. Each Managing Authority will follow its own scientific evaluation process as defined in the National Rules & Regulations. Once the Managing Authorities have completed their scientific evaluation processes, they will share the evaluation results with each other. The final decision is taken by the Joint Committee, which has equal representation of experts from both Managing Authorities. The Joint Committee will choose the projects to fund from among those evaluated as most successful by both Managing Authorities' evaluation processes, based on the following call criteria:
- Scientific / Technological Excellence
- Methodology
- Project Management
- Importance of the international collaboration
- Impact
Scientific Evaluation. The quality of the reports is determined by the Technical Committee, or experts commissioned by the latter, in accordance with the Xxxxxxxx et al. method by classifying each report into one of the following categories: (1) reliable without restriction, (2) reliable with restrictions, (3) not reliable, (4) not assignable. The allocation to the four categories must be accompanied by appropriate substantiation in accordance with the requirements described in the chapter "Documentation of reliability categories in data sheets (IUCLID)" of the Xxxxxxxx et al. publication. The quality of the robust summaries and IUCLID data sets is determined by the IP Consortium, or experts commissioned by the latter. If the documents (IUCLID data set and/or robust summary) submitted by a party supplying a report are missing or not in conformity with the state of the art, the IP Consortium, or experts commissioned by the latter, should develop a robust summary and an IUCLID update. Studies for which no standard protocol exists (e.g., exposure studies) must also be documented by an IUCLID data set and a robust summary, and are likewise to be evaluated under the Xxxxxxxx et al. method.
Scientific Evaluation. The proposals found administratively eligible will be scientifically evaluated by three Turkish- and three Maltese-nominated external reviewers. Following the scientific evaluation by the external reviewers, TÜBİTAK and MCST will exchange the Project Proposal Evaluation Forms for perusal. The Managing Authorities will make their decisions by considering the evaluations of the external reviewers from both countries. TÜBİTAK and MCST will select the project proposals to be funded during a joint committee meeting. Only the project proposals approved by both TÜBİTAK and MCST will be supported within the available budget. The intent of this 4th Call is to award five high-ranking projects, ideally including at least one PRIMA-related project.
Scientific Evaluation. For reports contributed by individual Members, the supplier provides the Consortium with the report itself and available summaries in the form of an IUCLID data set and a robust summary. The robust summary may also be integrated into the IUCLID data set. The quality of the reports is determined by the Technical Committee, or experts commissioned by the latter, in accordance with the Xxxxxxxx et al. method by classifying each report into one of the following categories: (1) reliable without restriction, (2) reliable with restrictions, (3) not reliable, (4) not assignable. The allocation to the four categories must be accompanied by appropriate substantiation in accordance with the requirements described in the chapter "Documentation of reliability categories in data sheets (IUCLID)" of the Xxxxxxxx et al. publication. The quality of the robust summaries and IUCLID data sets is determined by the Technical Committee, or experts commissioned by the latter. If the documents (IUCLID data set and/or robust summary) submitted by a party supplying a report are missing or not in conformity with the state of the art, the Technical Committee, or experts commissioned by the latter, should develop a robust summary and an IUCLID update. Studies for which no standard protocol exists (e.g., exposure studies) must also be documented by an IUCLID data set and a robust summary, and are likewise to be evaluated under the Xxxxxxxx et al. method.
Scientific Evaluation. 1.1. For reports contributed by individual members of the consortium, the supplier provides the consortium with the report itself and any existing and available summaries in the form of an IUCLID data set and a robust summary. The robust summary may also be integrated into the IUCLID data set.
Scientific Evaluation. A member of the Scientific Liaison Panel (SLP) with expertise in the respective proposal topic is allocated to each proposal. This SLP member accompanies the proposal and is responsible for it throughout the different steps of the evaluation process and, if the proposal is successful, for the subsequent cruise reporting. One member of the Advisory Board (AB) participates in the SLP meetings, ensuring the transparency of the evaluation process. The ARICE Evaluation Office maintains a list of expert evaluators to assist in the evaluation of all proposals for funding. The names of the experts assigned to individual proposals are not made public. Evaluators are required to read and sign a Declaration of Confidentiality and Conflict of Interest Form. Proposals meeting the eligibility criteria are evaluated on their individual merit by, as a general rule, three individual evaluators. Evaluators are chosen by mutual agreement between the Scientific Liaison Panel and the Evaluation Office. The experts examine the proposal(s) assigned to them and score and comment on each proposal under each of the Evaluation Criteria (see below) using an individual Proposal Assessment Form.
Scientific Evaluation.
- To prioritise the principle of caution. “Before launching a research project, an evaluation of the impacts should be performed beforehand. (…) [A good practice is] the evaluation performed by the scientists of the UN, the IPCC [Intergovernmental Panel on Climate Change]. (…) Unfortunately, all the evaluations I know are a posteriori and not a priori. [Another good practice] is the EU’s elaboration of the REACH legislation on the evaluation of cancer-related illnesses. There are many products with unknown impacts on health and the environment. Thanks to that evaluation, a new regulation will be established for controlling these types of products.” (NSCI)
- To learn from good practices. “To systematize good practices, to combine ex ante and ex post evaluation, to professionalize evaluation…” (MAN) “[A good practice is] the independent evaluation of the CSIC when elaborating its Planning of Activities for the period 2006-2009.” (PBMAN)
- Transparency and decentralisation. “We often do not know who we are evaluating, their curricula, who has chosen them, which criteria they are using… I think transparency and decentralisation are crucial.” (SSCI)
- Internationalisation of evaluation processes. “To externalise and internationalise evaluation processes.” (PBMAN)
- To apply the consequences of evaluation: rewards and punishments. “Measurable indicators should be implemented. (…) And sometimes drastic decisions must be adopted [for example, to fire professors or researchers].” (PVMAN) “For me it is important to have an evaluation that recognises the excellence of scientific work, giving more support to those who are doing it better. (…) What is well done should be rewarded, and what is not well done should not.”
Scientific Evaluation. The Parties will independently carry out the scientific assessment of the received applications. For CNR, the assessment will be performed by an Expert Committee whose appointment is published on the CNR institutional website, xxxxx://xxx.xxx.xx/it/progetti-comuni-ricerca; for the Tunisian side, it will be performed by an Expert Committee designated by the MHESR. The Committee will select the projects worthy of financing according to the following criteria:

Related to Scientific Evaluation

  • JOC EVALUATION If any materials being utilized for a project cannot be found in the RS Means Price Book, this question asks what the markup percentage on those materials is. When answering this question, please insert the number that represents your proposed markup percentage. Example: if you are proposing a 30 percent markup, please insert the number "30". Remember that this is a ceiling markup. You may mark up by a lesser percentage to the TIPS Member customer when pricing the project, but not by a greater percentage. EXAMPLE: You need special materials that are not in the RS Means Unit Price Book for a project. You would buy the materials and mark them up to the TIPS Member customer by the percentage you propose in this question. If the materials cost you, the contractor, $100 and you proposed a markup of 30 percent in this question, then you would charge the TIPS Member customer $130 for the materials. No response. TIPS/ESC Region 8 is required by Texas Government Code § 791 to be compensated for its work; thus, failure to agree shall render your response void and it will not be considered. Yes - No: Vendor agrees to remit to TIPS the required administration fee or, if resellers are named, to guarantee the fee remittance by or for the reseller named by the vendor?

  • TECHNICAL EVALUATION (a) Detailed technical evaluation shall be carried out by the Purchase Committee pursuant to the conditions in the tender document to determine the substantial responsiveness of each tender. For this clause, a substantially responsive bid is one that conforms to all the eligibility criteria and terms and conditions of the tender without any material deviation. The Institute's determination of a bid's responsiveness is to be based on the contents of the bid itself, without recourse to extrinsic evidence. The Institute shall also evaluate the technical bids to determine whether they are complete, whether required sureties have been furnished, whether the documents have been properly signed, and whether the bids are in order.

  • Job Evaluation The work of the provincial job evaluation steering committee (the JE Committee) will continue during the term of this Framework Agreement. The objectives of the JE Committee are as follows:
    • Review the results of the phase one and phase two pilots and outcomes of the committee work. Address any anomalies identified with the JE tool, process, or benchmarks.
    • Rate the provincial benchmarks and create a job hierarchy for the provincial benchmarks.
    • Gather data from all school districts and match existing job descriptions to the provincial benchmarks.
    • Identify the job hierarchy for local job descriptions for all school districts.
    • Compare the local job hierarchy to the benchmark-matched hierarchy.
    • Develop a methodology to convert points to pay bands; the confirmed method must be supported by current compensation best practices.
    • Identify training requirements to support implementation of the JE plan and develop training resources as required.
    • Create a maintenance program to support ongoing implementation of the JE plan at a local, regional or provincial level. The maintenance program will include a process for addressing the wage rates of incumbents in positions which are impacted by implementation of the JE plan.
  Once the objectives outlined above are completed, the JE Committee will mutually determine whether a local, regional or provincial approach to the steps outlined above is appropriate. It is recognized that the work of the committee is technical, complicated, lengthy and onerous. To accomplish the objectives, the parties agree that existing JE funds can be accessed by the JE Committee to engage consultant(s) to complete this work. It is further recognized that this process does not impact the established management right of employers to determine local job requirements and job descriptions, nor does this process alter any existing collective agreement rights or established practices. When the JE plan is ready to be implemented, and if an amendment to an existing collective agreement is required, the JE Committee will work with the local School District and Local Union to make recommendations for implementation. Any recommendations will also be provided to the Provincial Labour Management Committee (PLMC). As mutually agreed by the provincial parties and the JE Committee, the disbursement of available JE funds shall be retroactive to January 2, 2020. The committee will utilize available funds to provide 50% of the wage differential for the position falling the furthest below the wage rate established by the provincial JE process, and will continue this process until all JE fund monies at the time have been disbursed. The committee will follow compensation best practices to avoid problems such as inversion. The committee will report out to the provincial parties regularly during the term of the Framework Agreement. Should any concerns arise during the work of the committee, they will be referred to the PLMC. The provincial parties confirm that $4,419,859 of ongoing annual funds will be used to implement the Job Evaluation Plan. Effective July 1, 2022, there will be a one-time pause of the annual $4,419,859 JE funding. This amount has been allocated to the local table bargaining money. The annual funding will recommence July 1, 2023.

  • TEACHER EVALUATION A. All monitoring or observation of the work performance of a teacher shall be conducted openly and with full knowledge of the teacher.

  • Annual Evaluation The Partnership will be evaluated on an annual basis through the use of the Strategic Partnership Annual Evaluation Format as specified in Appendix C of OSHA Instruction CSP 00-00-000, OSHA Strategic Partnership Program for Worker Safety and Health. Xxxxxxxxx & Xxxxxx will be responsible for gathering required participant data to evaluate and track the overall results and success of the Partnership. This data will be shared with OSHA. OSHA will be responsible for writing and submitting the annual evaluation.

  • Program Evaluation The School District and the College will develop a plan for the evaluation of the Dual Credit program to be completed each year. The evaluation will include, but is not limited to, disaggregated attendance and retention rates, GPA in high-school-credit-only courses and college courses, satisfactory progress in college courses, state assessment results, SAT/ACT results as applicable, TSIA readiness by grade level, and adequate progress toward the college-readiness of the students in the program. The School District commits to collecting longitudinal data as specified by the College and to making data and performance outcomes available to the College upon request. The data points that HB 1638 and SACSCOC require to be longitudinally captured by the School District, in collaboration with the College, will include, at minimum: student enrollment, GPA, retention, persistence, completion, transfer, and scholarships. The School District will provide parent contact and demographic information to the College upon request for targeted marketing of degree completion or workforce development information to parents of Students. The School District agrees to obtain valid FERPA releases drafted to support the supply of such data if deemed required by counsel to either the School District or the College. The College conducts and reports regular and ongoing evaluations of the Dual Credit program effectiveness and uses the results for continuous improvement.
