
University of Paris-Saclay

5 Projects
  • Funder: UK Research and Innovation Project Code: MR/Y003845/1
    Funder Contribution: 532,780 GBP

    Thermal machines like car engines, airplane turbines and household refrigerators have long been essential to our modern society. By converting heat into mechanical work or vice versa, they set cars and airplanes in motion, drive the generators that deliver electricity to our computers and cool our food, living spaces and data centers. None of these modern applications would be possible without one fundamental theory that emerged 200 years ago and has since enabled engineers to develop ever more advanced machines: thermodynamics. Equipped with a few elementary concepts and laws, this theory lays down the basic rules that govern the performance of James Watt's 18th-century steam engine and today's car engines alike.

    With the next technological revolution underway in the nano and quantum world, there is now an increasing need to develop a new generation of thermal machines that operate on extremely small length-scales, to propel nano-robots or to cool the building blocks of quantum computers, which require ultra-low working temperatures. The last decade has seen a series of landmark experiments in which ever smaller thermal machines were realized, down to the level of single atoms. Such tiny objects are no longer bound by the rules of the classical world; they can occupy two places at the same time or influence each other at a distance without direct interaction. These phenomena are manifestations of the quantum laws of motion that govern the world at atomic scales. The discipline that aims to describe thermal machines operating in this world, and seeks to harness their technological potential, is called quantum thermodynamics and forms my main area of research.

    Technological applications of quantum thermal machines still face major conceptual and practical challenges. One of these challenges is their limited energy turnover, which is several orders of magnitude too small to meet the needs of most currently envisaged applications. The key idea underpinning my fellowship is to address this problem by harnessing the properties of collective states of matter, which emerge when large numbers of quantum objects begin to behave in a coordinated way, somewhat like a flock of birds. Laying the theoretical groundwork to realize new types of quantum thermal machines that exploit these phenomena to enhance their performance is the central aim of my research program.

    Building on our results so far, my team, my partners in theory and experiment, and I are working on three major topics, connected by the theme of seeking synergies between quantum and classical physics. First, to develop the methods required to describe quantum systems hosting collective effects, we investigate classical analogues of these systems, which can be efficiently simulated with classical computers; this idea is similar to using classical water waves as models for the wave character of quantum particles. Second, with the aim of integrating collective quantum thermal machines with classical consumers of their output, we investigate how thermodynamic quantities, like the work produced by a heat engine, can be transmitted from the quantum world into the classical one. Third, to find quantitative measures for the thermodynamic advantage generated by collective quantum effects, we explore how these phenomena make it possible to overcome general trade-off relations that constrain the power, efficiency and precision of classical small-scale thermal machines such as molecular motors.
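    One well-known example of the kind of trade-off relation mentioned above is the power-efficiency-constancy bound for classical steady-state heat engines (a standard result in stochastic thermodynamics, shown here only as an illustration; the summary does not name the specific relations the project targets). Here P is the average output power, Delta_P its long-time power fluctuation, eta the efficiency, eta_C the Carnot efficiency, T_c and T_h the cold- and hot-bath temperatures and k_B the Boltzmann constant:

```latex
% Power-efficiency-constancy trade-off for a classical steady-state heat engine
% (standard illustrative result, not taken from the project summary).
P \;\le\; \frac{\Delta_P\,(\eta_C - \eta)}{2\,k_B\,T_c\,\eta},
\qquad \eta_C = 1 - \frac{T_c}{T_h}.
```

    Approaching the Carnot efficiency at finite power therefore forces the power fluctuations to diverge; the project explores how collective quantum effects can relax constraints of this type.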
Quantum technologies are widely expected to shape our century in much the same way as the industrial revolution shaped the 19th and 20th centuries. Collective quantum thermal machines, for whose development we are helping to lay the conceptual foundations, have the potential to become the steam engines of this development. They will not move our future cars, but they might well help to run our quantum computers and encryption devices.

  • Funder: UK Research and Innovation Project Code: MR/Y011767/1
    Funder Contribution: 592,391 GBP

    Explosive volcanic eruptions have devastating impacts in near-vent areas, where pyroclastic density currents can cause significant loss of life, while the injection of large volumes of ash into the atmosphere, and its subsequent dispersal over hundreds to thousands of kilometres, poses significant and far-reaching hazards. Ash fall is a severe and wide-ranging volcanic hazard, causing roof collapse, health (respiratory) and agricultural issues, and wide-scale interruptions to essential infrastructure. Even ash emitted during moderately explosive eruptions can ground air traffic, as was demonstrated by the 2010 Eyjafjallajökull eruption (Iceland). As such, widespread volcanic ash dispersals present huge economic and societal costs. Disturbingly, 800 million people live within 100 km of active volcanoes globally, yet statistical studies of global eruption databases indicate significant under-recording of past volcanic eruptions deeper in time. For instance, such analyses indicate that up to 66% of VEI 5 eruptions (equivalent in scale to the 1980 Mount St Helens eruption) are missing from the geological record spanning the last 200,000 years. Our understanding of the magnitude and frequency of eruptions at a particular volcano is typically skewed towards recent activity, because records of older eruptions are fragmentary, often owing to erosion and/or burial by more recent eruptions. The better-preserved, shorter-term records, however, do not necessarily reflect the full range of volcanic activity, or variations in the tempo of activity. This is a major obstacle for long-term volcanic hazard assessments and hampers our ability to: i) determine changing eruption rates through time, ii) evaluate magnitude-frequency relationships and iii) project the recurrence intervals of hazardous ash dispersals.

    This research has overcome this impasse by reconstructing comprehensive long-term records of explosive volcanism for volcanoes in southern and central Japan. It exploits the under-utilised record of volcanic ash layers preserved in dense networks of marine and lake sediment cores away from the volcano. These continuous sediment sequences present unprecedented repositories of ash fall (preserved as visible and microscopic deposits), which are not susceptible to destructive near-source volcanic processes. Using state-of-the-art chemical 'fingerprinting' techniques, it is possible to pinpoint the volcanic source of the distal ash layers, whilst tracing these ash fall events across a network of cores provides the opportunity to computationally model and map past ash dispersals and calculate eruption magnitudes. Integrating cutting-edge dating techniques (40Ar/39Ar and 14C) to date the ash deposits enables us to reveal the timing and tempo of past explosive eruptions at an individual volcano and, importantly, to determine the recurrence intervals of widespread hazardous volcanic ash dispersals from these volcanoes. Our research in south and central Japan has successfully tackled eruption under-reporting and plugged gaps in the eruption records of numerous volcanoes. In the next phase of our research we will expand the application of our methods to the volcanoes of NE Japan and the Kurile Arc, utilising newly available marine cores from the International Ocean Discovery Programme (IODP) and the Japan Agency for Marine-Earth Science and Technology (JAMSTEC). In addition, using our ability to produce comprehensive eruption records, we will explore volcano-climate interactions.

The distal volcanological records generated in this project will continue to be examined in partnership with those directly responsible for volcanic hazard assessments at individual volcanoes, and with policy-makers working in the field.
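    As a simple illustration of the recurrence-interval idea (the eruption ages below are hypothetical placeholders, not project data, and a Poisson occurrence model is assumed purely for the sketch):

```python
# Minimal sketch: estimating the recurrence interval of large ash-dispersal
# events from a dated eruption record, assuming eruptions occur as a Poisson
# process. Ages are hypothetical, not project results.
import numpy as np

# Hypothetical eruption ages in thousands of years before present (ka),
# e.g. as would be obtained from 40Ar/39Ar and 14C dating of distal ash layers.
eruption_ages_ka = np.array([3.1, 7.4, 15.2, 22.8, 30.5, 41.9, 55.0, 68.3])

intervals = np.diff(np.sort(eruption_ages_ka))   # gaps between successive events
mean_interval = intervals.mean()                 # mean recurrence interval (kyr)
rate = 1.0 / mean_interval                       # events per kyr

# Under the Poisson assumption, probability of at least one such eruption
# within the next `window` thousand years:
window = 1.0
p_next = 1.0 - np.exp(-rate * window)
print(f"mean recurrence interval: {mean_interval:.1f} kyr")
print(f"P(at least one eruption in next {window:.0f} kyr): {p_next:.2f}")
```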

  • Funder: UK Research and Innovation Project Code: NE/S009736/1
    Funder Contribution: 548,105 GBP

    The Atlantic meridional overturning circulation (AMOC) - part of the so-called 'ocean conveyor belt' - is a key component of Earth's climate system. It involves the northward transport of warm surface waters to the high-latitude North Atlantic, where they cool (releasing heat to the atmosphere), sink and flow back southwards at depth. Changes in the AMOC are thought to alter global temperature and precipitation patterns, regional sea-level, and socio-economically important marine ecosystems. There are concerns regarding the strength and stability of the AMOC in the future, because predicted surface ocean warming and freshening could weaken the formation of dense water that helps drive the AMOC. Earlier research suggests that the AMOC may have different stable states, raising the possibility that it could rapidly switch to a weaker, or even an 'off', state, with severe impacts on global climate. IPCC models do not predict an abrupt weakening of the AMOC under typical 21st-century scenarios, yet there are suggestions that current climate models may be excessively stable.

    NERC and the international community have invested heavily in monitoring the AMOC, including the implementation of the RAPID array since 2004 and, more recently, the OSNAP array. Since observations began in 2004, the AMOC has weakened at a rate ten times faster than predicted by most models, yet the extent to which this decline can be attributed to natural multi-decadal variability is uncertain. The limited time span of the RAPID array means we are unable to gain an understanding of the nature of AMOC variability on timescales longer than interannual to decadal. We must therefore turn to geological archives to reconstruct AMOC changes beyond the instrumental record, yet there are no existing records that provide perspective on recent AMOC variability at multi-decadal and longer timescales.

    Using recent, novel techniques to constrain past variability, coupled with exceptional sediment archives, ReconAMOC will constrain past AMOC variability on decadal to centennial timescales, generating records for the last 7000 years that will become benchmark constraints on AMOC behaviour. We will focus on the past 7000 years because the climate was not dramatically different from the present day, and remnant glacial ice sheets had melted away, so that the major features of deep Atlantic circulation were broadly similar to modern. ReconAMOC deploys a twin approach that utilizes (i) the characteristic subsurface-temperature AMOC fingerprint, and (ii) the response of the deep western boundary current to AMOC change. We have verified these new paleoclimate approaches against variability in the instrumental record and demonstrated their applicability through an extensive pilot study. ReconAMOC is therefore a low-risk yet ambitious project, bringing together an international team of collaborators, that will meet a long-sought and much-needed requirement of a wide range of climate scientists and modellers. ReconAMOC will enable testing and improvement of model simulations of the AMOC, helping to assess the vulnerability of the AMOC to climate change and permitting investigation of the role of the AMOC in other components of the climate system. The topics addressed by ReconAMOC are key research targets at the national UK level (e.g. identified strategic science themes and goals within the NERC strategy) and internationally (e.g. CMIP6, IMAGESII, SCOR, PAGES, IODP and NSF).

Specifically, the ReconAMOC proposal builds on the NERC programmes RAPID, RAPID-WATCH and RAPID-AMOC, in which interannual to multi-decadal variability in the AMOC is a central focus, as well as the NERC programme ACSIS, which examines interannual to decadal climate variability in the Atlantic.
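    As a purely illustrative sketch of the temperature-fingerprint idea (the specific index definition and data used by ReconAMOC are not given in this summary; all values below are synthetic):

```python
# Minimal sketch: a temperature-based AMOC "fingerprint"-style index, built by
# contrasting a subpolar North Atlantic subsurface anomaly with the wider basin
# mean. Synthetic data; the project's actual fingerprint definition may differ.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1950, 2021)

# Hypothetical annual subsurface temperature anomalies (degrees C).
subpolar = 0.3 * np.sin(2 * np.pi * (years - 1950) / 60.0) + 0.1 * rng.standard_normal(years.size)
basin_mean = 0.3 * (years - 1950) / 70.0 + 0.1 * rng.standard_normal(years.size)

# Fingerprint index: subpolar anomaly relative to the basin mean; cooling of
# the subpolar gyre relative to the basin is commonly read as AMOC weakening.
index = subpolar - basin_mean
index_std = (index - index.mean()) / index.std()

print("standardised fingerprint index, last 5 years:", np.round(index_std[-5:], 2))
```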

  • Funder: UK Research and Innovation Project Code: BB/W013770/1
    Funder Contribution: 1,259,580 GBP

    Our vision for this Transition Award is to leverage and combine key emerging technologies in Artificial Intelligence (AI) and Engineering Biology (EB) to enable and pioneer a new era of world-leading advances that will directly contribute to the objectives of the National Engineering Biology Programme. Realisation of the benefits of Engineering Biology technologies is predicated on our ability to increase our capability for predictive design and optimisation of engineered biosystems across different biological scales. Such a scaled approach to Engineering Biology would serve to significantly accelerate translation of scientific research and innovation into applications of wide commercial and societal impact.

    Synthetic Biology has developed rapidly over the past decade. We now have the core tools and capabilities required to modify and engineer living systems. However, our ability to predictably design new biological systems is still limited, owing to the complexity, noise, and context dependence inherent to biology. To achieve the full capability of Engineering Biology, we require a change in capacity and scope. This requires lab automation to deliver high-throughput workflows. With this comes the challenge of managing and utilising the data-rich environment of biology that has emerged from recent advances in data collection capabilities, which include high-throughput genomics, transcriptomics, and metabolomics. Such approaches, however, produce datasets that are too large for direct human interpretation. There is thus a need to develop deep statistical learning and inference methods to uncover patterns and correlations within these data.

    On the other hand, steady improvements in computing power, combined with recent advances in the data and computer sciences, have fuelled a new era of Artificial Intelligence (AI)-driven methods and discoveries that are progressively permeating almost all sectors and industries. However, the type of data we can gather from biological systems does not match the requirements of the off-the-shelf ML/AI methods and tools that are currently available. This calls for the development of new bespoke AI/ML methods adapted to the specific features of biological measurement data. AI approaches have the potential both to learn from complex data and, when coupled to appropriate systems design and engineering methods, to provide the predictive power required for reliable engineering of biological systems with desired functions. As the field develops, there is thus an opportunity to strategically focus on data-centric approaches and AI-enabled methods that are appropriate to the challenges and themes of the National Engineering Biology Programme. Closing the Design-Build-Test-Learn loop using AI to direct the "learn" and "design" phases will provide a radical intervention that fundamentally changes the way we design, optimise and build biological systems.

    Through this AI-4-EB Transition Award we will build a network of inter-connected and inter-disciplinary researchers to both develop and apply next-generation AI technologies to biological problems. This will be achieved through a combination of leading-light inter-disciplinary pilot projects for application-driven research, meetings to build the scientific community, and sandpits supported by seed funding to generate novel ideas and new collaborations around AI approaches for real-world use.
We will also develop an RRI strategy to address the complex issues arising at the confluence of these two critical and transformative technologies. Overall, AI-4-EB will provide the necessary step-change for the analysis of large and heterogeneous biological data sets, and for AI-based design and optimisation of biological systems with sufficient predictive power to accelerate Engineering Biology.
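    As a minimal illustration of closing a Design-Build-Test-Learn loop with a data-driven "learn" and "design" step (not the project's methods; the objective function and surrogate model below are hypothetical stand-ins for a real build-and-test experiment):

```python
# Minimal sketch of an AI-directed Design-Build-Test-Learn (DBTL) loop.
# `measure_yield` stands in for a real build-and-test experiment; a simple
# quadratic surrogate plays the role of the "learn" model that proposes the
# next design. Illustrative only, not the project's pipeline.
import numpy as np

rng = np.random.default_rng(1)

def measure_yield(x):
    # Hypothetical noisy response of an engineered biosystem to one design
    # parameter x (e.g. a normalised promoter strength), peaking near 0.6.
    return float(np.exp(-((x - 0.6) ** 2) / 0.02) + 0.05 * rng.standard_normal())

designs = list(np.linspace(0.0, 1.0, 4))       # initial Designs
yields = [measure_yield(x) for x in designs]   # Build + Test

for _ in range(6):                             # Learn + Design iterations
    coeffs = np.polyfit(designs, yields, deg=2)        # Learn: fit surrogate
    candidates = np.linspace(0.0, 1.0, 101)
    predicted = np.polyval(coeffs, candidates)
    x_next = float(candidates[np.argmax(predicted)])   # Design: best predicted
    designs.append(x_next)                             # Build
    yields.append(measure_yield(x_next))               # Test

best = designs[int(np.argmax(yields))]
print(f"best design parameter found so far: {best:.2f}")
```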

  • Funder: UK Research and Innovation Project Code: EP/X026973/1
    Funder Contribution: 5,997,340 GBP

    Experiments using modern laser technologies and new light sources look at quantum systems undergoing dynamic change to understand molecular function and answer fundamental questions relevant to chemistry, materials and quantum technologies. Typical questions are: How can molecules be engineered for maximum efficiency during energy harvesting, UV protection or photocatalysis? What happens when strong and rapidly changing laser fields act on electrons in atoms and molecules? How fast do qubits lose information due to interactions with the environment? Will an array of interacting qubits in future quantum computers remain stable over long time-scales?

    Interpreting time-resolved experiments that aim to answer these questions requires Quantum Dynamics (QD) simulations, the theory of quantum motion. QD is on the cusp of being able to make quantitative predictions about large molecular systems, solving the time-dependent Schrödinger equation in a way that will help unravel the complicated signals from state-of-the-art experiments and provide mechanistic details of quantum processes. However, important methodological challenges remain, such as computational expense and accurate prediction of experimental observables, requiring a concerted team effort. Addressing these will greatly benefit the wider experimental and computational QD communities.

    In this programme grant we will develop transformative new QD simulation strategies that will uniquely deliver impact and insight for real-world applications across a range of technological and biological domains. The key to our vision is the development, dissemination, and wide adoption of powerful new universal software for QD simulations, building on our collective work on QD methods exploiting trajectory-guided basis functions. Present capability is, however, held back by the typically fragmented approach to academic software development. This lack of unification makes it difficult to use ideas from one group to improve the methods of another group, and even the simple comparison of QD simulation methods is non-trivial. Here, we will combine a wide range of existing methods into a unified code suitable for use by both computational and experimental researchers to model fundamental photo-excited molecular behaviour and interpret state-of-the-art experiments. Importantly, we will develop and implement new mathematical and numerical ideas within this software suite, with the explicit objective of pushing the system-size and time-scale limits beyond what is currently accessible within "standard" QD simulations.

    Our unified code will lead to powerful and reliable QD methods, simultaneously enabling easy adoption by non-specialists; for the first time, scientists developing and using QD simulations will be able to access, develop and deploy a common software framework, removing many of the inter- and intra-community barriers that exist within the current niche software set-ups across the QD domain. The transformative impact of method development and code integration is powerfully illustrated by electronic structure and classical molecular dynamics packages, used routinely by thousands of researchers around the world and recognised by several Nobel Prizes in the last few decades. Our programme grant aims to deliver a similar step-change by improving accessibility for QD simulations. Success for our programme grant would be a demonstrated increase in the adoption of advanced QD simulations across a broad range of end-user communities (e.g. spectroscopists, materials scientists, molecular designers). Furthermore, by supporting a large yet integrated cohort of early-career researchers, this programme grant will provide an enormous acceleration to developments in QD, positioning the UK as a global leader in this domain as we move from the era of classical computation and simulation into the quantum era of the coming decades.
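    As an illustration of the kind of calculation QD software performs (a textbook split-operator FFT propagation of the time-dependent Schrödinger equation; the project's trajectory-guided basis-function methods are more sophisticated and are not reproduced here):

```python
# Minimal sketch: split-operator FFT propagation of a 1D wavepacket under the
# time-dependent Schrödinger equation, in atomic units (hbar = m = 1), for a
# harmonic potential. Standard textbook method, shown for illustration only.
import numpy as np

n, L = 512, 20.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)            # momentum grid

V = 0.5 * x**2                                     # harmonic potential
dt, steps = 0.01, 500

psi = (2 / np.pi) ** 0.25 * np.exp(-((x - 2.0) ** 2))   # displaced Gaussian
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)            # normalise

expV = np.exp(-0.5j * V * dt)                      # half-step potential factor
expT = np.exp(-0.5j * k**2 * dt)                   # full-step kinetic factor

for _ in range(steps):                             # Trotter-split time steps
    psi = expV * psi
    psi = np.fft.ifft(expT * np.fft.fft(psi))
    psi = expV * psi

print("norm:", np.sum(np.abs(psi) ** 2) * dx)      # conserved (~1)
print("<x>:", np.sum(x * np.abs(psi) ** 2) * dx)   # oscillates in the well
```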
