Background and Motivation Sample Clauses

Background and Motivation. Global CO2 emissions from the consumption of fossil fuels increased dramatically from 22,188.5 million tons in 1995 to 33,508.4 million tons in 2015, an average annual growth rate of 2.1%.1 Fossil fuel-based energies still provide approximately 86.0% of total global energy needs.1,2 To address this problem, hydrogen, an attractive energy carrier whose energy density (140 MJ/kg) is more than twice that of typical solid fuels (50 MJ/kg), has been recognized as a promising alternative to the fossil oil used in industry and transportation.3 In addition, hydrogen has significant and versatile applications in traditional industries such as petroleum refining, ammonia fertilizer production, metal refining, and heating.4 Demand for hydrogen in the United States is projected to grow from 60 million to nearly 700 million metric tons between now and mid-century, even without considering the rapid development of fuel cell electric vehicles.4 The Hydrogen Council has made a comprehensive assessment of the future potential impact of the hydrogen economy; its report projects that hydrogen energy could meet 18% of the world's energy demand, create a $2.5 trillion market, and reduce carbon dioxide emissions by 40–60% in the transportation, industrial, and residential sectors.5 Although hydrogen is a renewable, "carbon-zero" fuel, 96% of current hydrogen is produced by steam reforming of nonrenewable fossil fuels (methane, coal, and oil), with high energy consumption and CO2 emissions.6 Moreover, owing to the nature of the steam reforming reaction, impurities such as CO or H2S are inevitable in the produced H2, and trace amounts of such impurities can severely poison the platinum (Pt) based catalysts currently used in proton exchange membrane fuel cells (PEMFCs).7,8 Therefore, combined with renewable energy, electrochemical and photoelectrochemical hydrogen production has attracted considerable interest worldwide as an alternative, environmentally friendly, long-term pathway to produce high-purity H2 at large scale, as suggested by the Department of Energy (DOE) in the United States (Figure 1).
Background and Motivation. Recent history with transient-execution side channels proves the need for clear specification of the limits of speculative execution to support both software reasoning and hardware development.
Background and Motivation. The nature of today's global competitive market has given rise to increased organizational cooperation in the form of strategic alliances, where organizations no longer compete in isolation, but as value chains. Globalization and increased market pressures lead organizations to enter into strategic partnerships with the overall goal of achieving a competitive advantage. Through aligning resources and capabilities with business partners, mutual benefits can be gained in the form of quality, time, and costs. The realization of such collaborative efforts requires integrated behavior, sharing of information, and appropriate management of business relationships. As a result, the concept of Supply Chain Management (SCM) has flourished over the last decade. The objective of SCM is, in short, to coordinate activities between businesses across traditional organizational boundaries to improve the performance of the supply chain partners and the supply chain as a whole. Another closely related concept that has been receiving increased attention over the last decade is the role of information technology (IT) in inter-organizational business activities. The use of such inter-organizational information systems (IOS) has become central to business collaboration, and the systems range from simple web portals to extensive integrated electronic networks. Recent and continuous advances in these technological solutions offer new ways to collaborate and compete inter-organizationally. And, given that these technological solutions are becoming so common and easy to procure, organizations that are late in adopting them might fall behind in the competitive environment of today's markets. There is an intersection between the two concepts of SCM and IOS. As Xxxxxx (2007) notes, IOS are critical in managing operational and strategic activities between organizations, as they can provide supply chain partners with real-time, critical information on demand and supply data. Xxxxxx and Xxxxxxxxxxxxxx (1998) go even further, saying that coordinated business activities, integrated behavior, and sharing of information between organizations require the use of an IOS. Hence, IOS can be viewed as an essential enabler of effective management of the supply chain (i.e., SCM). However, the majority of IOS projects are costly and may even be the largest investment an organization ever undertakes (Xxxxxx, 2005). The importance of ensuring the IOS's success is t...
Background and Motivation. Blockchain is a system of recording information in a way that makes it difficult or impossible to change, hack, or cheat the system. A blockchain is essentially a digital ledger of transactions that is duplicated and distributed across the entire network of computer systems on the blockchain. Each block in the chain contains a number of transactions, and every time a new transaction occurs on the blockchain, a record of that transaction is added to every participant's ledger. The decentralized database managed by multiple participants is known as Distributed Ledger Technology (DLT). This hash-based synchronization is quite different from protocols that compare two versions of the same file line by line, identify the exact updates, and transfer only those updates in a patch style. Instead, hash-based synchronization must upload to the cloud the entire chunks whose hashes have changed. Consequently, in a multi-cloud environment, two chunks that differ only slightly can be distributed to two different clouds.
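To make the contrast concrete, the following is a minimal sketch of hash-based chunk synchronization. The fixed 4 KiB chunk size, SHA-256 hashing, and helper names are illustrative assumptions, not taken from any particular DLT or cloud system:

```python
import hashlib

CHUNK_SIZE = 4096  # illustrative fixed chunk size (4 KiB)

def chunk_hashes(data: bytes) -> list[str]:
    """Split data into fixed-size chunks and hash each one."""
    return [
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    ]

def chunks_to_upload(old: bytes, new: bytes) -> list[int]:
    """Return indices of chunks whose hashes changed.

    Even a one-byte edit inside a chunk changes that chunk's hash,
    so the whole chunk must be re-uploaded -- unlike a line-by-line
    diff, which would transfer only the changed lines.
    """
    old_h, new_h = chunk_hashes(old), chunk_hashes(new)
    return [
        i for i, h in enumerate(new_h)
        if i >= len(old_h) or h != old_h[i]
    ]

# A single changed byte forces re-upload of an entire 4 KiB chunk.
old = b"a" * 10000
new = b"a" * 5000 + b"b" + b"a" * 4999
print(chunks_to_upload(old, new))  # -> [1]
```

Two near-identical chunks produce unrelated hashes, which is why, in a multi-cloud setting, they can end up stored on different clouds.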
Background and Motivation.
• Implemented in the Constraint-Based Semantics Project, Xerox PARC
• LFG parse instantiates lexical items from a semantic lexicon (meaning constructors)
• Issues raised by LFG treatment of coordination:
Background and Motivation. Within the Nunataryuk project, a vast amount and diversity of data will be produced. The purpose of the DMP is to document how the data generated within the project are handled during and after the project. It describes the basic principles for data management within the project, including standards, the generation of discovery and use metadata, data sharing, preservation, and life-cycle management. This DMP is a living document that will be updated during the project, in step with the periodic reports. Nunataryuk follows the principles outlined by the Open Research Data Pilot (OpenAIRE) and the FAIR Guiding Principles for scientific data management and stewardship (Wilkinson et al. 2016).
Background and Motivation. 1.1.1 Introduction to complex networks Complex systems are ubiquitous throughout the world, both in nature and within man-made structures. Over the past decade, large amounts of network data have become available and, correspondingly, the analysis of complex networks has become increasingly important. In response, researchers in diverse areas have focused their attention on such problems, and the study of complex systems has begun to develop into a discipline in its own right [13, 21, 24, 40, 41, 84, 85]. Mathematicians, physicists, computer scientists, biologists, and social scientists (among others) have approached these problems from various angles [2, 28, 42, 43, 67, 88]. This large amount of research within diverse scientific communities has produced many interesting results, but there are still many fundamental questions that are not fully answered. Additionally, as available datasets become larger and larger, efficient data structures and computational methods become paramount. One of the fundamental questions in network analysis is to determine the "most important" elements in a given network. The interpretation of what is meant by "important" can change from application to application. Measures of node importance are usually referred to as node centralities, and many centrality measures have been proposed, starting with the simplest of all, the node degree. This crude metric has the drawback of being too "local", as it does not take into account the connectivity of the immediate neighbors of the node under consideration. A number of more sophisticated centrality measures have been introduced that take into account the global connectivity properties of the network. These include various types of eigenvector centrality for both directed and undirected networks, betweenness centrality, and many others. Overviews of various centrality measures can be found in [13, 16, 21, 40, 72, 84, 85]. The centrality scores can be used to provide rankings of the nodes in the network. There are many different ranking methods in use (most of which depend on centrality measures), and many algorithms have been developed to compute these rankings. Information about the many different ranking schemes can be found in [8, 40, 67, 68, 72, 73, 74, 75].
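As an illustration of the local-versus-global distinction drawn above, the following sketch uses the networkx library to compute degree, eigenvector, and betweenness centrality for the same graph. The small example graph is an assumption added for illustration, not taken from the text:

```python
import networkx as nx

# A small illustrative graph: node 0 is a hub bridging two clusters.
G = nx.Graph([
    (0, 1), (0, 2), (0, 3),   # cluster A around the hub
    (0, 4),                   # bridge edge
    (4, 5), (4, 6), (5, 6),   # cluster B
])

# Degree centrality is purely local: it counts a node's neighbors.
degree = nx.degree_centrality(G)

# Eigenvector centrality is global: a node is important if its
# neighbors are important (dominant eigenvector of the adjacency matrix).
eigen = nx.eigenvector_centrality(G)

# Betweenness centrality counts shortest paths passing through a node.
between = nx.betweenness_centrality(G)

for n in sorted(G):
    print(n, round(degree[n], 3), round(eigen[n], 3), round(between[n], 3))
```

Nodes 0 and 4 have similar degrees, yet their betweenness scores differ sharply because node 0 lies on far more shortest paths, which is exactly the kind of distinction the degree metric misses.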
Background and Motivation. Looking back in history, trying to trace the ever-increasing abstraction that we may call mathematics, we can clearly see that there are two fundamental concepts: shapes and numbers. Number theory, and especially the study of Diophantine equations, can be considered a core mathematical study. We recall that a Diophantine equation is a system of polynomial equations over a field K, and therefore it can be thought of as a subset of the affine space K^n. This simple idea gave number theorists a whole new arsenal of techniques to tackle old problems. It also paved the path to new connections in mathematics. Probably the best example is Fermat's Last Theorem, which remained open for more than three centuries until Wiles gave his famous proof in 1995. This interplay between number theory and algebraic geometry can be used to find a natural, though unexpected, way to categorise Diophantine equations: the dimension of the zero locus. For example, we can restrict ourselves to one-dimensional varieties, or curves. Examples of curves are x + y + z = 0 in P^2 and y^2 − xz = yw − z^2 = xw − yz = 0 in P^3. These two serve as a fine example of a connection that would not have been possible without the use of algebraic geometry in number theory. A finer categorisation for curves is the genus. Curves are far more studied and understood than higher-dimensional varieties, but even in this case, the only genera that we have a good understanding of are 0 and 1. If the genus of a curve over a number field K is higher than 1, we still have some deep results, such as Faltings' Theorem, which asserts that the curve has only finitely many K-rational points. Elliptic curves (smooth curves of genus 1 that have a K-rational point) have formed a paradigm for the way we look for results on Diophantine equations. For a number field K, the set of K-rational points on an elliptic curve E defined over K forms a finitely generated abelian group; this is the famous Mordell–Weil Theorem. This thesis is concerned with analogues of the Mordell–Weil theorem in higher dimensions, building on recent advances in the two-dimensional case, due to Siksek [16] and Xxxxxx [2], [3].
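For reference, the Mordell–Weil theorem cited above admits the following standard formulation (a textbook statement added for clarity, not quoted from the thesis):

```latex
% Mordell–Weil theorem (standard statement): for a number field $K$
% and an elliptic curve $E$ defined over $K$,
\[
  E(K) \;\cong\; \mathbb{Z}^{r} \oplus E(K)_{\mathrm{tors}},
\]
% where the integer $r \ge 0$ is the rank of $E$ over $K$ and
% $E(K)_{\mathrm{tors}}$ is the finite subgroup of torsion points.
```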
Background and Motivation. Co-simulation techniques are popular in the design of cyber-physical systems (CPS) [18]. Such systems are typically engineered using a variety of languages and tools that adopt complementary paradigms; examples are physics-related models, control laws, and sequential, concurrent, and real-time programs. This diversity makes CPS generally difficult to analyse and study. The Functional Mock-up Interface (FMI) Standard [11] has been proposed to alleviate that problem and has since been successfully used in industry. It addresses the challenge of interoperability, coupling different simulators and their high-level control components via a bespoke FMI API (Application Programming Interface). While (co)simulation is currently the predominant approach to analysing CPS, this report describes a complementary, proof-based technique that uses mathematical reasoning and logic. Simulation is useful in helping engineers to understand modelling implications and spot design issues, but cannot provide universal guarantees of correctness and safety. It is usually impossible to run an exhaustive number of simulations as a way of testing the system. For these reasons, it is often not clear how the evidence provided by simulations is to be qualified, since simulations depend on parameters and algorithms, and are software systems (with possible faults) in their own right. Proof-based techniques, on the other hand, hold the promise of making universal claims about systems. They can potentially abstract from particular simulation scenarios, parametrisations of models, and interaction patterns used for testing. In traditional software engineering, they have been successfully used to validate the correctness of implementations against abstract requirements models [5]. Yet, their application to CPS is fraught with difficulties: the heterogeneous combination of languages used in typical descriptions of CPS raises issues of semantic integration and complexity in reasoning about those models. The ideal to which any verification technique aspires is a compositional approach, and such approaches are still rare for CPS [30].
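The coupling pattern that FMI standardizes can be illustrated with a minimal master algorithm. The sketch below is a simplified illustration, not the FMI API itself: the `Simulator` protocol and its `set_inputs`/`do_step`/`get_outputs` methods are hypothetical stand-ins for the FMU interface, and the fixed-step loop is the simplest possible coupling scheme:

```python
from typing import Protocol

class Simulator(Protocol):
    """Hypothetical stand-in for an FMI co-simulation unit (FMU)."""
    def set_inputs(self, inputs: dict[str, float]) -> None: ...
    def do_step(self, t: float, dt: float) -> None: ...
    def get_outputs(self) -> dict[str, float]: ...

def cosimulate(plant: Simulator, controller: Simulator,
               t_end: float, dt: float) -> None:
    """Fixed-step master loop: exchange values, then step both units."""
    t = 0.0
    while t < t_end:
        # Exchange output values between the two simulators.
        plant.set_inputs(controller.get_outputs())
        controller.set_inputs(plant.get_outputs())
        # Advance both simulators by one communication step.
        plant.do_step(t, dt)
        controller.do_step(t, dt)
        t += dt
```

Even in this toy form, the loop exposes the issues the clause raises: correctness depends on the step size, the exchange ordering, and the (fallible) master algorithm itself, none of which simulation alone can certify.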
Background and Motivation. Gilthead seabream and sea bass are the most important finfish species farmed in the Mediterranean Sea. Until the 1980s, seabream was only fished from wild populations, but successful reproduction and intensive rearing resulted in a rapid increase in production. Already by 1993, production from cage culture exceeded that of open-sea fishing. Seabream is the most farmed marine fish in the Mediterranean; the fish usually weigh between 400 and 600 g and are sold fresh, whole or eviscerated. The major producing countries within the EU are Greece, Spain, and Italy. France is the fourth largest producer of seabream juveniles. Turkey is the second major producer in the Mediterranean. In 2015, seabream accounted for 12% of total EU marine aquaculture production, in terms of both value and volume. Recently, the production volume of seabream fry and juveniles has increased, and cost reductions have been achieved through automation. The finfish aquaculture industry has invested heavily in farming technologies and automation to improve the quality, food safety, and traceability of the produced fish. A further increase in production volumes relies on environmentally friendly approaches to aquaculture, which are mandatory for production licences and commercially viable production. The ambition of the business case developed here in the Space@Sea project for seabream aquaculture is to increase sustainable food production at large scale under offshore conditions by making use of modular floating islands. An increase in food production at sea fits the EU Blue Growth strategy. Another important aspect is to reduce the environmental footprint by making use of a closed type of aquaculture system, minimising interactions (water intake and discharge) with the marine environment. Closed systems also make it possible to keep out parasites and to control disease outbreaks.