Implementing High Performance Data Analytics Sample Clauses

Implementing High Performance Data Analytics (HPDA). Task 5.4 starts in M25 and focuses on applications centred around the integration of HPDA techniques and simulation with clinical and/or experimental data. The first activities of this task will focus on developing an incubation model specific to these types of applications and on providing guidance to external users. We will gather information from the HPDA applications in the CompBioMed software stack and select exemplar cases, from both phases of CompBioMed, to develop the incubation model.

During the first part of the project, several molecular research applications were developed that involved the creation of machine learning-based models using extensive experimental datasets. These applications presented unique challenges in their development, validation and deployment. This task will leverage that experience to guide newly developed applications based on HPDA techniques, such as deep learning-based models or Gaussian processes. Several activities related to the development of HPDA applications are already ongoing in the project. In WP2, under Molecularly-based Medicine Exemplar Research (Task 2.2), a specific subtask is combining Molecular Dynamics simulation data with state-of-the-art deep learning models to address the sampling issues of protein simulation. The activity associated with the development of Alya, described in Section 8.3.1 above, also involves collaboration with cardiologists (e.g. Xxxxx Xxxx’Armellina in Leeds, Xxxx Xxxxx in Oxford) and the medical image analysis team (e.g. Xxxxxxx Xxxx in UOXF as part of T3.4 and T3.5) to develop HPDA tools for analysing combinations of clinical data and our computer simulations.

These exemplar activities with internal applications will be used to inform best-practice guidance for HPDA techniques centred around data availability, curation and pre-processing. A particular focus for machine learning applications will be the assessment of validation and/or reproducibility, which imposes specific requirements on the level of information provided to describe the datasets used for research applications.
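As an illustration of the kind of HPDA technique referred to above, the sketch below fits a Gaussian process surrogate to a toy set of simulation outputs and evaluates it, with uncertainty estimates, at hypothetical experimental measurement points. The data, kernel choice and scikit-learn usage are assumptions made purely for illustration; they are not drawn from the CompBioMed software stack.

```python
# Hypothetical sketch: a Gaussian process surrogate fitted to a small set of
# simulation outputs, then queried at assumed experimental measurement points.
# All data and parameter choices here are illustrative placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Stand-in for simulation results: one input parameter vs. a computed observable.
sim_inputs = np.linspace(0.0, 1.0, 12).reshape(-1, 1)
sim_outputs = np.sin(2 * np.pi * sim_inputs).ravel()

# Smooth RBF kernel for the underlying response plus a white-noise term for
# residual model error; both hyperparameters are refined during fitting.
kernel = 1.0 * RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(sim_inputs, sim_outputs)

# Predict at hypothetical experimental points, with standard deviations, so any
# discrepancy with measured values can be judged against the model's own error bars.
exp_inputs = rng.uniform(0.0, 1.0, size=(5, 1))
mean, std = gp.predict(exp_inputs, return_std=True)
for x, m, s in zip(exp_inputs.ravel(), mean, std):
    print(f"input={x:.2f}  predicted={m:+.3f} +/- {s:.3f}")
```

Returning the predictive standard deviation alongside the mean is what makes this style of model relevant to the validation and reproducibility assessment described above: disagreement with experimental data can be weighed against the model's own stated uncertainty, provided the datasets behind both are fully documented.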

Related to Implementing High Performance Data Analytics

  • PERFORMANCE MANAGEMENT SYSTEM 5.1 The Employee agrees to participate in the performance management system that the Employer adopts or introduces for the Employer, management and municipal staff of the Employer.

  • Trunk Group Architecture and Traffic Routing The Parties shall jointly engineer and configure Local/IntraLATA Trunks over the physical Interconnection arrangements as follows:

  • Information Technology Enterprise Architecture Requirements If this Contract involves information technology-related products or services, the Contractor agrees that all such products or services are compatible with any of the technology standards found at xxxxx://xxx.xx.xxx/iot/2394.htm that are applicable, including the assistive technology standard. The State may terminate this Contract for default if the terms of this paragraph are breached.

  • SERVICE MONITORING, ANALYSES AND ORACLE SOFTWARE 11.1 We continuously monitor the Services to facilitate Oracle’s operation of the Services; to help resolve Your service requests; to detect and address threats to the functionality, security, integrity, and availability of the Services as well as any content, data, or applications in the Services; and to detect and address illegal acts or violations of the Acceptable Use Policy. Oracle monitoring tools do not collect or store any of Your Content residing in the Services, except as needed for such purposes. Oracle does not monitor, and does not address issues with, non-Oracle software provided by You or any of Your Users that is stored in, or run on or through, the Services. Information collected by Oracle monitoring tools (excluding Your Content) may also be used to assist in managing Oracle’s product and service portfolio, to help Oracle address deficiencies in its product and service offerings, and for license management purposes.

  • Performance Management 17.1 The Contractor will appoint a suitable Account Manager to liaise with the Authority’s Strategic Contract Manager. Any/all changes to the terms and conditions of the Agreement will be agreed in writing between the Authority’s Strategic Contract Manager and the Contractor’s appointed representative.

  • Information Technology Accessibility Standards Any information technology related products or services purchased, used or maintained through this Grant must be compatible with the principles and goals contained in the Electronic and Information Technology Accessibility Standards adopted by the Architectural and Transportation Barriers Compliance Board under Section 508 of the federal Rehabilitation Act of 1973 (29 U.S.C. §794d), as amended. The federal Electronic and Information Technology Accessibility Standards can be found at: xxxx://xxx.xxxxxx-xxxxx.xxx/508.htm.

  • Infrastructure Vulnerability Scanning Supplier will scan its internal environments (e.g., servers, network devices, etc.) related to Deliverables monthly and external environments related to Deliverables weekly. Supplier will have a defined process to address any findings but will ensure that any high-risk vulnerabilities are addressed within 30 days.

  • Performance Monitoring A. Performance Monitoring of Subrecipient by County, State of California and/or HUD shall consist of requested and/or required written reporting, as well as onsite monitoring by County, State of California or HUD representatives.

  • Quarterly Contractor Performance Reporting Customers shall complete a Contractor Performance Survey (Exhibit I) for each Contractor on a Quarterly basis. Customers will electronically submit the completed Contractor Performance Survey(s) to the Department Contract Manager no later than the due date indicated in Contract Exhibit D, Section 17, Additional Special Contract Conditions. The completed Contractor Performance Survey(s) will be used by the Department as a performance-reporting tool to measure the performance of Contractors. The Department reserves the right to modify the Contractor Performance Survey document and introduce additional performance-reporting tools as they are developed, including online tools (e.g. tools within MyFloridaMarketPlace or on the Department's website).

  • CONTRACTOR PERFORMANCE AUDIT The Contractor shall allow the Authorized User to assess Contractor’s performance by providing any materials requested in the Authorized User Agreement (e.g., page load times, response times, uptime, and fail over time). The Authorized User may perform this Contractor performance audit with a third party at its discretion, at the Authorized User’s expense. The Contractor shall perform an independent audit of its Data Centers, at least annually, at Contractor expense. The Contractor will provide a data owner facing audit report upon request by the Authorized User. The Contractor shall identify any confidential, trade secret, or proprietary information in accordance with Appendix B, Section 9(a), Confidential/Trade Secret Materials.
