Background and motivation Sample Clauses
The "Background and Motivation" clause serves to provide context and explain the reasons behind entering into the agreement or undertaking the project. It typically outlines the circumstances, objectives, or business needs that led the parties to formalize their relationship, such as a shared goal to develop a new product or address a market opportunity. By clearly stating the underlying motivations, this clause helps ensure that all parties have a mutual understanding of the agreement's purpose, reducing the risk of misunderstandings and aligning expectations throughout the collaboration.
Background and motivation. Global CO2 emissions from the consumption of fossil fuels increased dramatically from 22,188.5 million tons in 1995 to 33,508.4 million tons in 2015, an annual average growth rate of 2.1%.1 In current global energy consumption, fossil-fuel-based energy still provides approximately 86.0% of total global energy needs.1,2 To address this problem, hydrogen, an attractive energy carrier whose energy density (140 MJ/kg) is more than twice that of typical solid fuels (50 MJ/kg), has been recognized as a promising alternative to the fossil fuels used in industry and transportation.3 In addition, hydrogen has significant and versatile applications in traditional industry, such as petroleum refining, ammonia fertilizer production, metal refining, and heating.4 Demand for hydrogen in the United States is projected to grow from 60 million to nearly 700 million metric tons between now and mid-century, even without considering the rapid development of fuel cell electric vehicles.4 The Hydrogen Council has made a comprehensive assessment of the potential future impact of the hydrogen economy. According to its report, hydrogen energy could meet 18% of the world’s energy demand, create a $2.5 trillion market, and reduce carbon dioxide emissions by 40–60% in the transportation, industrial, and residential sectors.5 Although hydrogen is a renewable, “carbon-zero” fuel, 96% of current hydrogen is produced by steam reforming of nonrenewable fossil fuels (methane, coal, and oil), with high energy consumption and CO2 emissions.6 Moreover, owing to the nature of the steam reforming reaction, impurities such as CO or H2S are inevitable in the produced H2. Trace amounts of such impurities can severely poison the platinum (Pt) based catalysts currently used in proton exchange membrane fuel cells (PEMFCs).7,8 Therefore, combined with renewable energy, electrochemical and photoelectrochemical hydrogen production has attracted considerable interest worldwide as an alternative, environmentally friendly long-term pathway to produce high-purity H2 at large scale, as suggested by the Department of Energy (DOE) in the United States (Figure 1).
Background and motivation. Despite the overwhelming cost of fossil fuels, commercial photovoltaic solar cells account for less than 0.1% of energy consumption in the US. This is partially due to the low conversion efficiency (~15%) and high installation cost (~$7/W) of current solar cell technology, which far exceeds the cost of generating electricity from fossil fuels. In this context, semiconductor nanomaterials have promising applications in solar cell technology, as they offer good photostability and conversion efficiency. However, to date, no significant advances have been achieved, owing either to size nonuniformity, low yield, or matrix inhomogeneity. Various methods exist for producing Si nanoparticles, but most produce a wide size distribution. In addition, many methods, e.g. laser ablation, gas pyrolysis, and ion beam deposition, generally produce small quantities of particles, which cannot readily be integrated into subsequent processes or scaled up for manufacturing.
Background and motivation. Synoptic reporting [82-84] has become a powerful tool for providing summarized findings through predefined data element templates such as the CAP Cancer Protocols [4]. Meanwhile, standards groups such as IHE are proposing structured reporting standards such as Anatomic Pathology Structured Reports [3] in HL7. While the community is moving towards structured reporting, a vast number of pathology reports exist in legacy systems in unstructured format, and the standardization effort captures only the major data elements, leaving useful research information in free text that is difficult to process and search. We explore the adaptive vocabulary feature of IDEAL-X, which employs an initial controlled vocabulary that is continuously refined through online learning during the extraction process, as sketched below. We also provide a built-in query interface to support searching for patients based on extracted data elements.
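To illustrate the idea only (this is not IDEAL-X code), the following Python sketch shows an extraction loop whose controlled vocabulary grows as reviewer-confirmed values are fed back during extraction; the class, field names, and example report text are hypothetical.

```python
# Illustrative sketch: extraction with a controlled vocabulary that is refined online.
import re
from collections import defaultdict

class AdaptiveVocabularyExtractor:
    def __init__(self, seed_vocabulary):
        # data element -> set of accepted values (the controlled vocabulary)
        self.vocabulary = {k: set(v) for k, v in seed_vocabulary.items()}

    def extract(self, report_text):
        """Return data elements found in a free-text report using the current vocabulary."""
        found = defaultdict(list)
        for element, values in self.vocabulary.items():
            for value in values:
                if re.search(rf"\b{re.escape(value)}\b", report_text, re.IGNORECASE):
                    found[element].append(value)
        return dict(found)

    def learn(self, element, confirmed_value):
        """Online learning step: add a reviewer-confirmed value to the vocabulary."""
        self.vocabulary.setdefault(element, set()).add(confirmed_value)

# Hypothetical usage on a pathology report snippet
extractor = AdaptiveVocabularyExtractor({"histologic_type": ["adenocarcinoma"]})
report = "Final diagnosis: invasive adenocarcinoma, grade 2."
print(extractor.extract(report))            # only the seed vocabulary matches
extractor.learn("histologic_grade", "grade 2")   # vocabulary refined during extraction
print(extractor.extract(report))            # subsequent reports also yield the new element
```

The extracted element/value pairs could then be stored and queried, which is the role the built-in query interface plays in the workflow described above.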
Background and motivation. Within the Nunataryuk project, a vast amount and diversity of data will be produced. The purpose of the DMP is to document how the data generated within the project are handled during and after the project. It describes the basic principles for data management within the project, including standards, the generation of discovery and use metadata, data sharing, preservation, and life-cycle management. This DMP is a living document that will be updated during the project in step with the periodic reports. Nunataryuk is following the principles outlined by the Open Research Data Pilot (OpenAIRE) and The FAIR Guiding Principles for scientific data management and stewardship (Xxxxxxxxx et al. 2016).
Background and motivation. User demand for high-volume, reliable data traffic has grown at an unprecedented rate in recent years and is projected to double every year during the current decade [15]. This is predominantly due to new and emerging data-hungry personal hand-held devices such as tablets and smart phones [13], which are narrowing the gap between user demands on mobile and fixed networks. To satisfy such demands, International Mobile Telecommunications-Advanced (IMT-A), a global standards initiative, was introduced by the International Telecommunication Union (ITU) in 2007 [16]. IMT-A requires peak DL and UL data rates of 1 Gbps and 500 Mbps, respectively, for low-mobility scenarios. At that time, peak data rates of 300 Mbps and 75 Mbps at a maximum available bandwidth of 20 MHz were supported by Long Term Evolution (LTE), corresponding to Releases 8 and 9 of the 3rd Generation Partnership Project (3GPP) [17]. Promising to enhance LTE’s performance, LTE-Advanced (LTE-A) was issued by 3GPP as an IMT-A technology in 2010, two years after the IMT-A initiative was introduced. Since achievable data rates increase linearly with bandwidth, acquiring more spectrum is a necessity for meeting the ever-growing traffic requirements. LTE-A allows the utilisation of a maximum of 100 MHz of system bandwidth. However, due to the unavailability of large fragments of contiguous bandwidth, operators seek alternatives that use spectrum chunks at different carrier frequencies and aggregate them for data transmission. First standardised in Release 10 as one of the key features of 3GPP, carrier aggregation (CA) facilitates the aggregation of fragmented and non-contiguous bandwidth, an expensive and scarce commodity [18][19][20]. This thesis only considers CA in the DL.
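As a back-of-the-envelope illustration of the linear scaling mentioned above, the short Python sketch below estimates the peak DL rate obtainable by aggregating several 20 MHz component carriers. It assumes the Release 8/9 figure of 300 Mbps per 20 MHz carrier and deliberately ignores control overhead, MIMO configuration, and scheduling effects.

```python
# Rough sketch: linear scaling of peak DL rate with aggregated bandwidth.
PEAK_RATE_PER_CARRIER_MBPS = 300   # LTE Rel-8/9 peak DL rate at 20 MHz
CARRIER_BANDWIDTH_MHZ = 20

def peak_dl_rate_mbps(aggregated_bandwidth_mhz):
    carriers = aggregated_bandwidth_mhz / CARRIER_BANDWIDTH_MHZ
    return carriers * PEAK_RATE_PER_CARRIER_MBPS

print(peak_dl_rate_mbps(20))    # 300.0 Mbps  -> single LTE carrier
print(peak_dl_rate_mbps(100))   # 1500.0 Mbps -> 5 aggregated carriers, above the 1 Gbps IMT-A target
```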
Background and motivation. Cyber-Physical Systems (CPS) are systems that comprise both real-world entities and digital components. Modelling and designing CPSs typically requires a combination of different languages and tools that adopt complementary specification paradigms. For real-world artefacts, physics models in the form of differential equations are the norm. Digital components, such as software controllers, are typically described via control diagrams, state machines, and real-time programs. This diversity of specification and design methods makes CPS challenging to study and analyse. Co-simulation [16] is perhaps the de facto technique for analysing the behaviour of CPS. It requires that models of artefacts are simulated in isolation, while master algorithms control the various simulators and thereby orchestrate the co-simulation as a whole. This, however, raises issues of interoperability between the master algorithm and simulators. The Functional Mock-up Interface (FMI) Standard [11] has been proposed to alleviate those issues, and has since been successfully used in many industrial applications. The FMI standard prescribes how master algorithms (MA) and simulators communicate. It does so by virtue of a bespoke API that simulators have to implement, and that can be used to implement compliant master algorithms. The API enables master algorithms to exchange data between the components of a co-simulation, called FMUs (Functional Mock-up Units), perform simulation steps, and suitably deal with errors in simulators. It also allows for advanced features such as roll-back of already performed steps. While (co)simulation is currently the predominant approach to validate CPS models, we here describe a complementary technique based on a formal model of an FMI system. Our technique formalises both the master algorithm and the simulated FMUs, and allows for verification of their properties.
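To make the role of a master algorithm concrete, here is a minimal Python sketch of a fixed-step master that exchanges data between two toy FMUs, performs simulation steps, and rolls back when a step fails. It only loosely mirrors the standard's set/get/doStep/roll-back operations; the class, the toy dynamics, and all method names are hypothetical, not the FMI C API.

```python
# Minimal sketch of a fixed-step master algorithm over two toy "FMUs".
class FMU:
    """Hypothetical stand-in for a Functional Mock-up Unit."""
    def __init__(self, name, state=0.0):
        self.name, self.state, self._saved, self.u = name, state, state, 0.0

    def set_input(self, u): self.u = u
    def get_output(self): return self.state
    def do_step(self, t, h):
        self.state += h * self.u          # toy dynamics: integrate the input
        return True                        # True = step succeeded
    def save_state(self): self._saved = self.state      # enables roll-back
    def rollback(self): self.state = self._saved

def master(fmu_a, fmu_b, t_end=1.0, h=0.1):
    t = 0.0
    while t < t_end:
        for f in (fmu_a, fmu_b):
            f.save_state()
        # exchange data between the components before stepping
        fmu_a.set_input(fmu_b.get_output())
        fmu_b.set_input(fmu_a.get_output())
        ok = fmu_a.do_step(t, h) and fmu_b.do_step(t, h)
        if not ok:                         # a simulator reported an error:
            for f in (fmu_a, fmu_b):       # roll back and retry with a smaller step
                f.rollback()
            h /= 2
            continue
        t += h

master(FMU("plant", 1.0), FMU("controller", 0.0))
```

A formal model of an FMI system, as described above, captures exactly this interaction, the master's stepping and roll-back policy and each FMU's step behaviour, so that properties of the orchestration can be verified rather than only simulated.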
Background and motivation. In recent years, data-driven dialogue systems such as BlenderBot [17] and ChatGPT (xxxxx://xxxxxx.xxx/blog/chatgpt), which utilize large seq-to-seq language models [3, 9, 15], have garnered significant interest from various communities. The applications of these dialogue systems are seemingly endless, with numerous organizations processing years’ worth of audio recordings from human-to-human dialogues to train their models. However, these audio recordings were often collected without the intention of data-driven model development, resulting in low-quality audio with considerable background noise that makes automatic speech recognition (ASR) challenging. Moreover, these recordings typically use a single channel for all speakers rather than assigning dedicated channels to individual speakers, necessitating the use of speaker diarization (SD), an additional challenging task. SD is a speech processing task that identifies the speakers of audio segments extracted from a conversation involving two or more speakers [13]. Despite the excellent performance of ASR models in translating audio into text without recognizing individual speakers [2, 5, 14], unstable SD can have a detrimental effect on developing robust dialogue models, as any model trained on such data would fail to learn unique languages for distinct speakers. Therefore, analyzing the performance of ASR and SD on specific audio streams is crucial for producing high-quality transcripts. However, there has been a lack of comprehensive approaches to simultaneously evaluate both types of errors.
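As one concrete ingredient of such an evaluation, the ASR side is commonly scored with word error rate (WER), a word-level edit distance between a reference and a hypothesis transcript. The Python sketch below computes it (the example strings are invented); a full diarization error rate for the SD side additionally requires an optimal mapping between reference and hypothesis speaker labels, which is omitted here.

```python
# Sketch: word error rate (WER) via dynamic-programming edit distance over words.
def wer(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1): d[i][0] = i          # deletions
    for j in range(len(hyp) + 1): d[0][j] = j          # insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i-1][j-1] + (ref[i-1] != hyp[j-1])  # substitution or match
            d[i][j] = min(sub, d[i-1][j] + 1, d[i][j-1] + 1)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("set the thermostat to seventy", "set thermostat to seventeen"))  # 0.4
```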
Background and motivation. Looking back in history, trying to trace the ever-increasing abstraction that we may call mathematics, we can clearly see that there are two fundamental concepts: shapes and numbers. Number theory, and especially the study of Diophantine equations, can be considered a core mathematical study. We recall that a Diophantine equation is a system of polynomial equations over a field K, and therefore its solution set can be thought of as a subset of the affine space K^n. This simple idea gave number theorists a whole new arsenal of techniques to tackle old problems. It also paved the path to new connections in mathematics. Probably the best example is Xxxxxx’s Last Theorem, which remained open for more than three centuries until Xxxxx gave his famous proof in 1995. This interplay between number theory and algebraic geometry can be used to find a natural, though unexpected, way to categorise Diophantine equations: the dimension of the zero locus. For example, we can restrict ourselves to one-dimensional varieties, or curves. Examples of curves are x + y + z = 0 in P^2 and y^2 − xz = yw − z^2 = xw − yz = 0 in P^3. These two serve as a fine example of a connection that would not have been possible without the use of algebraic geometry in number theory. A finer categorisation for curves is the genus. Curves are far more studied and understood than higher-dimensional varieties, but even in this case, the only genera that we understand well are 0 and 1. If the genus of a curve over a number field K is greater than 1, we still have some deep results, such as Faltings’ Theorem, which asserts that the curve has only finitely many points. Elliptic curves (smooth curves of genus 1 that have a K-rational point) have formed a paradigm for how to look for results on Diophantine equations. For a number field K, the set of K-rational points on an elliptic curve E defined over K forms a finitely generated abelian group; this is the famous Mordell–Weil Theorem, stated below. This thesis is concerned with analogues of the Mordell–Weil theorem in higher dimensions, building on recent advances in the two-dimensional case due to Siksek [16] and Xxxxxx [2], [3].
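For reference, a standard formulation of the Mordell–Weil theorem, in LaTeX notation, is

E(K) \cong \mathbb{Z}^{r} \oplus E(K)_{\mathrm{tors}},

where r \geq 0 is the rank of E over K and E(K)_{\mathrm{tors}} is the finite torsion subgroup.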
Background and motivation. The nature of today’s global competitive market has given rise to increased organizational cooperation in the form of strategic alliances, where organizations no longer compete in isolation but as value chains. Globalization and increased market pressures lead organizations to enter into strategic partnerships with the overall goal of achieving a competitive advantage. Through aligning resources and capabilities with business partners, mutual benefits can be gained in the form of quality, time, and costs. The realization of such collaborative efforts requires integrated behavior, sharing of information, and appropriate management of business relationships. As a result, the concept of Supply Chain Management (SCM) has flourished over the last decade. The objective of SCM is, in short, to coordinate activities between businesses across traditional organizational boundaries to improve the performance of the supply chain partners and the supply chain as a whole. Another closely related concept which has been receiving increased attention over the last decade is the role of information technology (IT) in inter-organizational business activities. The use of such inter-organizational information systems (IOS) has become central to business collaboration, and the different systems range from simple web portals to extensive integrated electronic networks. Recent and continuous advances in these technological solutions offer new ways to collaborate and compete inter-organizationally. And, in view of the fact that these technological solutions are becoming so common and easy to procure, organizations that are late in adopting them might fall behind in the competitive environment of today’s markets. There is an intersection between the two concepts of SCM and IOS. As Xxxxxx (2007) notes, IOS are critical in managing operational and strategic activities between organizations, as they can provide supply chain partners with real-time, critical information on demand and supply data. Xxxxxx and Xxxxxxxxxxxxxx (1998) take it even further by saying that coordinated business activities, integrated behavior, and sharing of information between organizations require the use of an IOS. Hence, IOS can be viewed as an essential enabler of effective management of the supply chain (i.e. SCM). However, the majority of IOS projects are costly and might even be the largest investment an organization ever undertakes (Xxxxxx, 2005). The importance of ensuring the IOS’s success is t...
Background and motivation. Gilthead seabream and sea bass are the most important finfish species farmed in the Mediterranean Sea. Until the 1980s, seabream was only fished from wild populations, but successful reproduction and intensive rearing resulted in a rapid increase in production. Already in 1993, culturing in cages exceeded open-sea fishing. Seabream is the largest marine farmed fish in the Mediterranean, usually weighing between 400 and 600 g and sold fresh, whole or eviscerated. The major producing countries within the EU are Greece, Spain, and Italy. France is the fourth largest producer of seabream juveniles. Turkey is the second major producer in the Mediterranean. In 2015, seabream accounted for 12% of total EU marine aquaculture production, in terms of both value and volume. Recently, the production volume of seabream fry and juveniles has increased, and cost reductions have been achieved through automation. The finfish aquaculture industry has invested heavily in farming technologies and automation to improve the quality, food safety, and traceability of produced fish. A further increase in production volumes relies on environmentally friendly approaches to aquaculture, which are mandatory for production licences and commercially viable production. The ambition for the business case developed here in the Space@Sea project for seabream aquaculture is to increase sustainable food production at large scale in offshore conditions by making use of modular floating islands. An increase of food production at sea fits the EU Blue Growth strategy. An important aspect is also to reduce the environmental footprint by making use of a closed type of aquaculture system, minimising interactions (water intake and discharge) with the marine environment. Closed systems also make it possible to keep out parasites and to control disease outbreaks.