Background and motivation Clause Samples
The "Background and Motivation" clause serves to provide context and explain the reasons behind entering into the agreement or undertaking the project. It typically outlines the circumstances, objectives, or business needs that led the parties to formalize their relationship, such as a shared goal to develop a new product or address a market opportunity. By clearly stating the underlying motivations, this clause helps ensure that all parties have a mutual understanding of the agreement's purpose, reducing the risk of misunderstandings and aligning expectations throughout the collaboration.
Background and motivation. Global CO2 emissions from the consumption of fossil oil increased dramatically from 22,188.5 million tons in 1995 to 33,508.4 million tons in 2015, at an average annual rate of 2.1%.1 In current global energy consumption, fossil fuel-based energies still provide approximately 86.0% of total energy needs.1,2 To address this problem, hydrogen, an attractive energy carrier whose energy density (140 MJ/kg) is nearly three times that of typical solid fuels (50 MJ/kg), has been recognized as a promising alternative to the fossil oil used in industry and transportation.3 In addition, hydrogen has significant applications in traditional industries such as petroleum refining, ammonia fertilizer production, metal refining, and heating.4 Demand for hydrogen in the United States is projected to grow from 60 million to nearly 700 million metric tons between now and mid-century, even without considering the rapid development of fuel cell electric vehicles.4 The Hydrogen Council has made a comprehensive assessment of the potential future impact of the hydrogen economy: in its report, hydrogen energy is expected to meet 18% of the world's energy demand, create a $2.5 trillion market, and reduce carbon dioxide emissions by 40–60% in the transportation, industry, and residential sectors.5 Although hydrogen is a renewable "carbon-zero" fuel, 96% of current hydrogen is produced by the steam reforming of nonrenewable fossil fuels (methane, coal, and oil), with high energy consumption and CO2 emissions.6 Moreover, due to the nature of the steam reforming reaction, impurities such as CO or H2S are inevitable in the produced H2.
Trace amounts of such impurities can severely poison the platinum (Pt) based catalysts currently used in proton exchange membrane fuel cells (PEMFCs).7,8 Therefore, combined with renewable energy, electrochemical and photoelectrochemical hydrogen production has attracted considerable interest worldwide as an alternative, environmentally friendly, long-term pathway to produce high-purity H2 on a large scale, as suggested by the Department of Energy (DOE) in the United States (Figure 1).
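The 2.1% average annual growth rate quoted at the start of this sample can be checked directly from the two endpoint figures; a quick sketch (note that the rate is a compound annual rate, not a simple average):

```python
# Check the quoted average annual growth rate of global CO2 emissions.
emissions_1995 = 22188.5  # million tons, figure quoted above
emissions_2015 = 33508.4  # million tons
years = 2015 - 1995

# Compound annual growth rate over the 20-year span.
annual_rate_pct = ((emissions_2015 / emissions_1995) ** (1 / years) - 1) * 100
# annual_rate_pct is about 2.08, which rounds to the quoted 2.1% per year
```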
Background and motivation. The nature of today's global competitive market has given rise to increased organizational cooperation in the form of strategic alliances, where organizations no longer compete in isolation but as value chains. Globalization and increased market pressures lead organizations to enter into strategic partnerships with the overall goal of achieving a competitive advantage. By aligning resources and capabilities with business partners, mutual benefits can be gained in the form of quality, time, and cost. Realizing such collaborative efforts requires integrated behavior, sharing of information, and appropriate management of business relationships. As a result, the concept of Supply Chain Management (SCM) has flourished over the last decade. The objective of SCM is, in short, to coordinate activities between businesses across traditional organizational boundaries to improve the performance of the supply chain partners and the supply chain as a whole. Another closely related concept that has attracted increasing attention over the last decade is the role of information technology (IT) in inter-organizational business activities. The use of such inter-organizational information systems (IOS) has become central to business collaboration, and the systems range from simple web portals to extensive integrated electronic networks. Recent and continuing advances in these technological solutions offer new ways to collaborate and compete inter-organizationally. And, given that these solutions are becoming so common and easy to procure, organizations that are late in adopting them may fall behind in today's competitive markets. There is an intersection between the two concepts of SCM and IOS.
As ▇▇▇▇▇▇ (2007) notes, IOS are critical in managing operational and strategic activities between organizations, as they can provide supply chain partners with real-time, critical demand and supply data. ▇▇▇▇▇▇ and ▇▇▇▇▇▇▇▇▇▇▇▇▇▇ (1998) go even further, arguing that coordinated business activities, integrated behavior, and sharing of information between organizations require the use of an IOS. Hence, IOS can be viewed as an essential enabler of effective management of the supply chain (i.e. SCM). However, most IOS projects are costly and may even be the largest investment an organization ever undertakes (▇▇▇▇▇▇, 2005). The importance of ensuring the IOS's success is t...
Background and motivation. Co-simulation techniques are popular in the design of cyber-physical systems (CPS) [18]. Such systems are typically engineered using a variety of languages and tools that adopt complementary paradigms; examples are physics-related models, control laws, and sequential, concurrent and real-time programs. This diversity makes CPS generally difficult to analyse and study. The Functional Mock-up Interface (FMI) Standard [11] has been proposed to alleviate that problem and has since been successfully used in industry. It addresses the challenge of interoperability, coupling different simulators and their high-level control components via a bespoke FMI API (Application Programming Interface). While (co)simulation is currently the predominant approach to analyse CPS, this report describes a proof-based complementary technique that uses mathematical reasoning and logic. Simulation is useful in helping engineers to understand modelling implications and spot design issues, but cannot provide universal guarantees of correctness and safety. It is usually impossible to run an exhaustive number of simulations as a way of testing the system. For these reasons, it is often not clear how the evidence provided by simulations is to be qualified, since simulations depend on parameters and algorithms, and are software systems (with possible faults) in their own right. Proof-based techniques, on the other hand, hold the promise of making universal claims about systems. They can potentially abstract from particular simulation scenarios, parametrisations of models, and interaction patterns used for testing. In traditional software engineering, they have been successfully used to validate the correctness of implementations against abstract requirements models [5].
Yet, their application to CPS is fraught with difficulties: the heterogeneous combination of languages used in typical descriptions of CPS raises issues of semantic integration and complexity in reasoning about those models. The aspiring ideal of any verification technique is a compositional approach, and such approaches are still rare for CPS [30].
Background and motivation. Within the Nunataryuk project, a vast amount and diversity of data will be produced. The purpose of the DMP is to document how the data generated within the project are handled during and after the project. It describes the basic principles for data management within the project, including standards, generation of discovery and use metadata, data sharing, preservation, and life cycle management. This DMP is a living document that will be updated during the project alongside the periodic reports. Nunataryuk follows the principles outlined by the Open Research Data Pilot (OpenAIRE) and the FAIR Guiding Principles for scientific data management and stewardship (▇▇▇▇▇▇▇▇▇ et al. 2016).
Background and motivation. Synoptic reporting [82-84] has become a powerful tool for providing summarized findings through predefined data element templates such as the CAP Cancer Protocols [4]. Meanwhile, standards groups such as IHE are proposing structured reporting standards such as Anatomic Pathology Structured Reports [3] in HL7. While the community is tending towards structured reporting, a vast amount of pathology reports exists in legacy systems in unstructured format, and the standardization effort captures only major data elements, leaving useful research information in free text that is difficult to process and search. We explore the adaptive vocabulary feature of IDEAL-X, which employs an initial controlled vocabulary that is continuously refined through online learning during the extraction process. We also provide a built-in query interface to support searching for patients based on extracted data elements.
Background and motivation. In recent years, data-driven dialogue systems such as BlenderBot [17] and ChatGPT1, which utilize large seq-to-seq language models [3, 9, 15], have garnered significant interest from various communities. The applications of these dialogue systems are seemingly endless, with numerous organizations processing years' worth of audio recordings from human-to-human dialogues to train their models. However, these audio recordings were often collected without the intention of data-driven model development, resulting in low-quality audio with considerable background noise that makes automatic speech recognition (ASR) challenging. Moreover, these recordings typically use a single channel for all speakers rather than assigning dedicated channels to individual speakers, necessitating the use of speaker diarization (SD), an additional challenging task. SD is a speech processing task that identifies the speakers of audio segments extracted from a conversation involving two or more speakers [13]. Despite the excellent performance of ASR models in translating audio into text without recognizing individual speakers [2, 5, 14], unstable SD can have a detrimental effect on developing robust dialogue models, as any model trained on such data would fail to learn the distinct language use of individual speakers. Therefore, analyzing the performance of ASR and SD on specific audio streams is crucial for producing high-quality transcripts. However, there has been a lack of comprehensive approaches that evaluate both types of errors simultaneously.
1 ▇▇▇▇▇://▇▇▇▇▇▇.▇▇▇/blog/chatgpt
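The clause does not name a metric, but ASR performance of the kind discussed above is conventionally summarized by word error rate (WER): the word-level edit distance from the reference transcript, normalized by the reference length. A minimal illustrative sketch (function name and interface are ours, not from the work cited):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level Levenshtein distance divided by reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits turning the first i reference words into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(dp[i - 1][j] + 1,   # deletion
                           dp[i][j - 1] + 1,   # insertion
                           substitution)
    return dp[len(ref)][len(hyp)] / len(ref)
```

A joint ASR + SD evaluation of the kind the clause calls for would additionally score speaker-label errors on each segment; this sketch covers only the transcription half.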
Background and motivation. Looking back in history, trying to trace the ever-increasing abstraction that we may call mathematics, we can clearly see that there are two fundamental concepts: shapes and numbers. Number theory, and especially the study of Diophantine equations, can be considered a core mathematical study. We recall that a Diophantine equation is a system of polynomial equations over a field K, and therefore its solution set can be thought of as a subset of the affine space K^n. This simple idea gave number theorists a whole new arsenal of techniques to tackle old problems. It also paved the path to new connections in mathematics. Probably the best example is ▇▇▇▇▇▇'s Last Theorem, which remained open for more than three centuries until ▇▇▇▇▇ gave his famous proof in 1995. This interplay between number theory and algebraic geometry can be used to find a natural, though unexpected, way to categorise Diophantine equations: the dimension of the zero locus. For example, we can restrict ourselves to one-dimensional varieties, or curves. Examples of curves are x + y + z = 0 in P^2 and y^2 − xz = yw − z^2 = xw − yz = 0 in P^3. These two serve as a fine example of a connection that would not have been possible without the use of algebraic geometry in number theory. A finer categorisation for curves is the genus. Curves are far more studied and understood than higher-dimensional varieties, but even in this case, the only genera that we understand well are 0 and 1. If the genus of a curve over a number field K is higher than 1, we still have some deep results, such as Faltings' Theorem, which asserts that the curve has only finitely many points. Elliptic curves (smooth curves of genus 1 that have a K-rational point) have formed a paradigm for the way we look for results on Diophantine equations.
For a number field K, the set of K-rational points on an elliptic curve E defined over K forms a finitely generated abelian group; this is the famous Mordell–Weil Theorem. This thesis is concerned with analogues of the Mordell–Weil theorem in higher dimensions, building on recent advances in the two-dimensional case, due to Siksek [16] and ▇▇▇▇▇▇ [2], [3].
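The Mordell–Weil theorem cited above admits a compact standard statement; written out (the notation is the conventional one, not taken from the thesis itself):

```latex
% Mordell--Weil theorem: for an elliptic curve E over a number field K,
% the group E(K) of K-rational points is a finitely generated abelian group:
\[
  E(K) \;\cong\; E(K)_{\mathrm{tors}} \oplus \mathbb{Z}^{r},
\]
% where E(K)_{tors} is the finite torsion subgroup and the integer
% r \ge 0 is called the rank of E over K.
```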
Background and motivation. The Emory University Cardiovascular Biobank aims to address a variety of research questions in cardiovascular disease. It is a registry of patients with suspected or confirmed coronary artery disease undergoing cardiac catheterization. The final database will store approximately 12,000 patients' records and will contain information from eight sources, including major Emory Healthcare units. Apart from the data collected with a standardized questionnaire, clinical data are collected from up to eight types of reports: Cardiac Catheterization Procedure Report, Echocardiogram Report, History and Physical Report, Discharge Summary, Outpatient Clinic Note, Outpatient Clinic Letter, Coronary Angiogram Report, and Inpatient Report, as well as Discharge Medication lists. Data elements extracted from reports and structured records are integrated to provide comprehensive information for patient identification. Manual extraction of the data is infeasible due to the large number of reports.
Background and motivation. Blockchain is a system of recording information in a way that makes it difficult or impossible to change, hack, or cheat the system. A blockchain is essentially a digital ledger of transactions that is duplicated and distributed across the entire network of computer systems on the blockchain. Each block in the chain contains a number of transactions, and every time a new transaction occurs on the blockchain, a record of that transaction is added to every participant's ledger. A decentralized database managed by multiple participants is known as Distributed Ledger Technology (DLT). This hash-based synchronization differs from protocols that compare two versions of the same file line by line, identify the exact updates, and transfer only those updates as a patch. Instead, hash-based synchronization must transfer to the cloud every whole chunk whose hash has changed. Consequently, in a multi-cloud environment, two chunks that differ only slightly can be distributed to two different clouds.
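The contrast drawn above, between line-by-line delta transfer and whole-chunk transfer keyed on hashes, can be sketched in a few lines; the chunk size and function names here are illustrative, not taken from any particular system:

```python
import hashlib

CHUNK_SIZE = 4096  # bytes per chunk (illustrative value)

def chunk_hashes(data: bytes) -> list[str]:
    """Split data into fixed-size chunks and hash each one."""
    return [
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    ]

def changed_chunks(old: bytes, new: bytes) -> list[int]:
    """Indices of chunks whose hashes differ and must be re-uploaded whole."""
    old_h, new_h = chunk_hashes(old), chunk_hashes(new)
    return [
        i for i, h in enumerate(new_h)
        if i >= len(old_h) or h != old_h[i]
    ]
```

A one-byte edit inside a chunk changes that chunk's hash and forces re-upload of the entire chunk, which is why, in a multi-cloud setting, two chunks differing only slightly can end up stored on different clouds.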
Background and motivation. The use of a hybrid membrane-liquefaction process for post-combustion CO2 capture can potentially be more cost-effective than two-stage membrane processes or standard MEA absorption processes [1]. In the membrane-assisted CO2 liquefaction (MAL) process, the two separation technologies each carry out a partial separation within their favourable regime of operation. Membrane separation is generally suited for bulk separation with moderate product purity, while the low-temperature liquefaction process is well suited for purifying the CO2 stream, from moderate purity to a high-purity product, by phase separation, as described in CEMCAP deliverable D11.3 [2]. An advantage of this process is that it requires no process steam, which is normally not available in cement plants. The MAL process has been investigated by process simulations and laboratory experiments to validate its performance. Focus has been on the obtainable carbon capture ratio (CCR), CO2 product purity, and the main process parameters (e.g. temperature, pressure and retention time in separation vessels) for synthesized binary membrane permeate-gas compositions. To raise the TRL of the MAL capture technology for cement plants to 7–8, a test plant treating real flue gas from a cement plant in an operational environment is required. In this work, a test plant design has been proposed and simulated in Aspen HYSYS. The necessary main components, equipment types, and availability of off-the-shelf equipment have been investigated. Suggestions on how the plant can be operated, based on experience from the laboratory experiments, are also provided.
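The carbon capture ratio (CCR) reported on above is, in the usual convention, the fraction of the CO2 entering with the flue gas that is recovered in the product stream; a minimal sketch (variable names are illustrative, not taken from the CEMCAP deliverables):

```python
def carbon_capture_ratio(co2_captured: float, co2_in_flue_gas: float) -> float:
    """CCR = captured CO2 mass flow divided by CO2 mass flow in the flue gas.

    Both arguments must be in the same unit, e.g. kg/h.
    """
    if co2_in_flue_gas <= 0:
        raise ValueError("flue-gas CO2 flow must be positive")
    return co2_captured / co2_in_flue_gas

# e.g. capturing 9,000 kg/h out of a 10,000 kg/h inlet gives CCR = 0.90
```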