
Maison de la Simulation

6 Projects, page 1 of 2
  • Funder: French National Research Agency (ANR) Project Code: ANR-16-ACHN-0019
    Funder Contribution: 599,999 EUR

    In 1935 Wigner and Huntington speculated that metallization of compressed hydrogen should occur at around 25 GPa. Present experiments have reached about 350 GPa without finding the metallic phase, which is now predicted to occur at 400 GPa at low temperature. This mismatch is emblematic of the unexpected and rich physics observed in this pressure range. Several crystalline phases have been detected at low temperature, but molecular dissociation has not been observed. The melting line of the molecular crystal is reentrant, with a maximum temperature of 800 K around 100 GPa and a decrease at higher pressure, suggesting the presence of a new quantum state of matter, a metallic superfluid or a superconducting superfluid, stabilized against crystallization by nuclear quantum effects.

    Because of the experimental limitations in reaching higher pressures with hydrogen samples, a new line of thought in the search for hydrogen metallization is to study hydride compounds in which H2 molecules are hosted in the lattice of heavier elements. Particularly suitable materials appear to be silane (SiH4) and hydrogen sulfide (H2S), which was very recently found to be a conventional superconductor with a particularly high critical temperature (190 K). Lithium hydrides are also very interesting materials: they are predicted to favor hydrogen metallization and could be of interest for technological applications.

    Hydrogen is also the most abundant element in the universe, followed by helium; together they make up many astronomical objects, e.g. Jupiter and Saturn in our own solar system and a large number of other objects discovered in the past decade. The basic ingredient of planetary models is the equation of state of hydrogen, helium and their mixtures over a wide range of pressures, temperatures and concentrations. Of particular importance is to establish the demixing transition of hydrogen-helium mixtures and its interplay with metallization in both hydrogen and helium. The occurrence and location of these phase lines might explain experimental observations of the planets.

    First-principles methods have been widely applied to hydrogen, light elements and hydrides under compression. However, standard Density Functional Theory (DFT) methods have difficulty providing accurate predictions of metallization. Conversely, ground-state Quantum Monte Carlo methods have proven to provide reliable predictions even at metallization. An additional difficulty for light elements is the proper account of nuclear quantum effects, which are significant at high pressure. Recently we introduced the Coupled Electron-Ion Monte Carlo (CEIMC) method, based entirely on Quantum Monte Carlo (see the sketch after this summary), which overcomes both limitations and is particularly suitable for hydrogen and light elements (helium and lithium) under extreme conditions. So far it has been applied to predict a first-order liquid-liquid transition in hydrogen and the principal Hugoniot line in deuterium.

    Our project is to investigate, by CEIMC, metallization in hydrogen and in other light elements such as helium and lithium and their hydride compounds, together with its interplay with melting and other phase lines. On the methodological side we intend to develop CEIMC further by integrating it into an open-access package, which will allow the methodology to spread more easily and will facilitate its use for systems with heavier elements, where a distinction between core and valence electrons, based on the use of pseudopotentials, is necessary. For instance, the quantitative modeling of water is still very challenging for conventional first-principles methods, whereas this new method will make it possible to treat electronic correlation, dispersion interactions and quantum nuclei, the three relevant effects, simultaneously and on the same footing. Finally, we will develop Quantum Monte Carlo based methods for dynamical properties such as conductivity and response functions.
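    At the core of CEIMC is a Metropolis sampling of the nuclear degrees of freedom in which the Born-Oppenheimer energy differences are themselves statistically noisy Quantum Monte Carlo estimates. As a minimal sketch of how unbiased sampling can be preserved under such noise, the penalty method of Ceperley and Dellago attenuates the acceptance probability according to the statistical error \sigma of the estimated energy difference, assuming that error is Gaussian:

        A(s \to s') = \min\left\{ 1,\; \exp\left[ -\beta \, \Delta E(s \to s') - \tfrac{1}{2} (\beta \sigma)^2 \right] \right\}

    The extra term (\beta \sigma)^2 / 2 penalizes moves whose energy difference is poorly resolved, so that detailed balance holds on average over the noise.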

  • Funder: French National Research Agency (ANR) Project Code: ANR-12-MONU-0022
    Funder Contribution: 674,391 EUR

    The European Southern Observatory is leading the design phases for the European Extremely Large Telescope (E-ELT), a 39 m diameter telescope that will provide Europe with the biggest eye on the universe ever built, with first light foreseen in 2022. The E-ELT will be the first telescope to depend entirely, for routine operations, on adaptive optics (AO), an instrumental technique for the correction of dynamically evolving aberrations in an optical system, used on astronomical telescopes to compensate in real time for the effect of atmospheric turbulence. The two first-light instruments, ELT-CAM (a wide-field imager) and ELT-IFU (an integral field spectrograph), are both designed to be coupled to AO modules. The PHASE partnership, which gathers most of the French AO community, is one of the core contributors to the AO modules, being strongly involved in both consortia selected to lead the final design studies for the two first-light instruments.

    The proposed COMputing Platform for Adaptive optics SystemS (COMPASS) will provide the PHASE community with powerful means to lead the development of both AO modules, as the final design phases should begin in 2012. Based on a total integration of software with hardware and relying on a high-performance heterogeneous architecture, the COMPASS platform will be used to perform end-to-end simulations of AO system behavior and performance, as well as to design and test new concepts for the Real-Time Computer (RTC), a core component of any AO system, whose central kernel is illustrated after this summary. It will also provide critical decision tools for optimizing the opto-mechanical design of the instruments that will be developed for the E-ELT.

    The simulation of an AO system involves multiple physics, from atmospheric turbulence models to tomographic reconstruction to control theory. Moreover, full-scale E-ELT simulations are compute-intensive applications and as such good candidates for hardware accelerators such as manycore processors. Among those accelerators, CUDA hardware provides graphics processors (GPUs) equipped with HPC-compatible features. The proposed platform will rely on a scalable heterogeneous architecture, based on GPUs as accelerators and built from commodity components, able to provide sufficient computing power at a reasonable cost.

    The main objective of the COMPASS project is to provide the PHASE community with a full-scale end-to-end AO development platform able to address the E-ELT scale and including a real-time core that can be directly integrated on a real system. Additionally, one of the key topics of this project is the development of a prototype for a high-speed, low-latency image acquisition and processing system dedicated to AO systems and fully integrated in the simulation framework. The goal of the COMPASS project is to lead developments along four main axes: AO modeling, real-time control for AO, low-latency image acquisition, and E-ELT instrument design. While these developments are mainly driven by E-ELT instrumentation needs, they could have other applications, such as real-time processing of image streams for detection, recognition and identification in surveillance and decision-support contexts in defense, industry, security or medical surgery. This project will federate the efforts of teams with complementary expertise, from high-performance computing to adaptive optics systems to astrophysics, around a high-performance development platform. Spin-offs in each of these domains are expected from such a multi-disciplinary collaboration.
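    The computational heart of an AO real-time controller of this kind is typically a dense matrix-vector multiply: a precomputed command matrix maps wavefront-sensor slope measurements to deformable-mirror commands at every frame. The C sketch below is a minimal, illustrative CPU version; the names and sizes are hypothetical, and on a platform like COMPASS this kernel would be offloaded to GPUs (e.g. via a BLAS gemv) to meet the sub-millisecond latency budget.

        /* Minimal, illustrative RTC kernel: commands = R * slopes.
         * R is an (n_act x n_slopes) command matrix computed offline;
         * all names and sizes are hypothetical. */
        void rtc_step(int n_act, int n_slopes,
                      const float *R,        /* row-major, n_act x n_slopes */
                      const float *slopes,   /* n_slopes sensor measurements */
                      float *commands)       /* n_act actuator commands */
        {
            for (int i = 0; i < n_act; ++i) {
                float acc = 0.0f;
                for (int j = 0; j < n_slopes; ++j)
                    acc += R[i * n_slopes + j] * slopes[j];
                commands[i] = acc;
            }
        }

    At E-ELT scale, with tens of thousands of slopes and actuators per frame at kilohertz rates, this single loop nest already motivates the GPU-based architecture described above.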

  • Funder: French National Research Agency (ANR) Project Code: ANR-13-MONU-0008
    Funder Contribution: 1,985,730 EUR

    Model simulations are central to the study of complex mechanisms and feedbacks in the climate system and to providing estimates of future and past climate changes. Recent trends in climate modelling are to add more physical components to the modelled system, to increase the resolution of each individual component, and to make more systematic use of large suites of simulations to address many scientific questions. Climate simulations may therefore differ in their initial state, parameter values, representation of physical processes, spatial resolution, model complexity, and degree of realism or idealisation. In addition, there is a strong need for evaluating, improving and monitoring the performance of climate models using a large ensemble of diagnostics and a better integration of model outputs and observational data.

    High-performance computing is currently approaching the exascale and has the potential to produce an exponential increase in the size and number of simulations. However, post-processing, analysis, and exploration of the generated data have stalled, and there is a strong need for new tools to cope with the growing size and complexity of the underlying simulations and datasets. Exascale simulations require new scalable software tools to generate, manage and mine those simulations and data, to extract the relevant information and to take the correct decisions.

    The primary purpose of this project is to develop a platform capable of running large ensembles of simulations with a suite of models, handling the complex and voluminous datasets generated, and facilitating the evaluation and validation of the models and the use of higher-resolution models. We propose to gather interdisciplinary skills to design, using a component-based approach, a specific programming environment for scalable scientific simulations and analytics, integrating new and efficient ways of deploying and analysing the applications on High Performance Computing (HPC) systems. CONVERGENCE, gathering HPC and informatics expertise that cuts across the individual partners and the broader HPC community, will allow the national climate community to leverage information technology (IT) innovations to address its specific needs.

    Our methodology consists in developing an ensemble of generic elements needed to run the French climate models with different grids and resolutions, ensuring efficient and reliable execution of these models, managing a large volume and number of data, and allowing analysis of the results and precise evaluation of the models. These elements include data structure definition and input-output (IO), code coupling and interpolation, as well as runtime and pre/post-processing environments. A common data and metadata structure (sketched after this summary) will allow consistent information to be transferred between the various elements. All these generic elements will be open source and publicly available.

    The IPSL-CM and CNRM-CM climate models will make use of these elements, which will constitute a national platform for climate modelling. This platform will be used, in its entirety, to optimise and tune the next version of the IPSL-CM model and to develop a global coupled climate model with regional grid refinement. It will also be used, at least partially, to run ensembles of the CNRM-CM model at relatively high resolution and to run a very-high-resolution prototype of this model. The climate models we have developed are already involved in many international projects. For instance, we participate in the Coupled Model Intercomparison Project (CMIP), which is very demanding but has high visibility: its results are widely used and are in particular synthesised in the IPCC (Intergovernmental Panel on Climate Change) assessment reports. The CONVERGENCE project will constitute an invaluable step for the French climate community to prepare for and better contribute to the next phase of CMIP.
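    As an illustration of what such a common data and metadata structure could look like, the C sketch below bundles a field's values with the metadata (name, units, grid shape, time stamp) that IO, coupling and post-processing components all need in order to interpret it consistently. The type and member names are hypothetical, not taken from the project.

        /* Hypothetical self-describing field exchanged between components. */
        typedef struct {
            char    name[64];          /* variable name, e.g. "surface_temperature" */
            char    units[32];         /* physical units, e.g. "K" */
            int     nlon, nlat, nlev;  /* grid dimensions */
            double  valid_time;        /* seconds since an agreed reference date */
            double *values;            /* nlon*nlat*nlev values, flattened */
        } climate_field;

    Because every component reads and writes the same descriptor, a coupler can, for example, check units and grid shapes before interpolating a field from one model grid onto another.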

  • Funder: French National Research Agency (ANR) Project Code: ANR-15-SPPE-0003
    Funder Contribution: 215,999 EUR

    Exascale challenges the programmer to write multi-level parallel programs, i.e. to employ different paradigms to address the multiple levels of parallelism in the system. In the multi-level programming paradigm of FP3C, which targets post-petascale systems, users express high-level parallelism in the YML workflow language and employ parallel components written in the XcalableMP paradigm. XcalableMP (XMP) aims to combine productivity and performance; it is a directive-based PGAS (partitioned global address space) language specified by Japan's PC Cluster Consortium, where it is the main vehicle for research on post-petascale programming models. XMP provides both a global-view model and a local-view model: the former targets node-level parallelism, while the latter offers a complete PGAS programming model. The XMP implementation employs MPI as its communication interface. In YML, the YvetteML workflow language is used to describe the parallelism of an application at a very high level. YML provides a compiler to translate the YvetteML notation into XMP-parallel programs and a just-in-time scheduler managing the execution of parallel programs. It hides low-level communication details from the programmer, particularly when coupling complex applications.

    Runtime error detection is the most practical approach to correctness checking. By exploiting the MPI profiling interface (illustrated after this summary), the MUST correctness checker can currently detect a wide range of issues in MPI, as well as in OpenMP and hybrid programs.

    The overall goal of this project is to ease the programming of future Exascale systems by increasing programming productivity. To this end we will investigate the application of scalable correctness checking methods to support the YML workflow language, the XMP programming model and selected features of MPI, such as one-sided communication. This includes research on how programming languages and parallelization paradigms could be extended to increase the validity and scalability of automatic correctness checking analyses. In summary, we will address two open research questions: first, which properties of a language or parallelization paradigm are required to enable effective automatic correctness checking, and possibly to avoid errors in the first place; and second, how existing specifications or APIs can be extended to provide the necessary semantic information to the correctness checking tool.

    The development of correctness checking support for XMP will significantly improve the productivity of programming with XMP for Exascale systems. This project continues a French-Japanese collaboration of more than 10 years and adds scalable correctness checking support as a new component, delivered by the German partner. The project will result in joint research, publications and software development, and is consequently expected to build important assets for future research activities.
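    The MPI profiling interface on which tools like MUST rely lets a tool library interpose on any MPI call: the tool defines the MPI_ function itself, performs its checks, and forwards the call to the PMPI_ entry point of the real implementation. A minimal sketch in C, with deliberately simple checks, assuming the standard MPI-3 bindings:

        #include <mpi.h>
        #include <stdio.h>

        /* Interposed MPI_Send: validate arguments, then forward to the
         * real implementation through the PMPI profiling entry point. */
        int MPI_Send(const void *buf, int count, MPI_Datatype datatype,
                     int dest, int tag, MPI_Comm comm)
        {
            if (count < 0)
                fprintf(stderr, "correctness error: negative count in MPI_Send\n");
            if (buf == NULL && count > 0)
                fprintf(stderr, "correctness error: NULL buffer in MPI_Send\n");
            return PMPI_Send(buf, count, datatype, dest, tag, comm);
        }

    Linking such a wrapper library ahead of the MPI library is enough to intercept every send in an unmodified application; real tools such as MUST add distributed analyses (e.g. deadlock detection) on top of this mechanism.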

  • Funder: French National Research Agency (ANR) Project Code: ANR-16-CE06-0009
    Funder Contribution: 767,187 EUR

    Deep geothermal energy allows clean, non-intermittent heat and/or power production regardless of weather conditions, at any hour of the day or night. It will contribute to the decarbonization of our economy, reaching its maximum mitigation potential by 2050 (ANCRE, 2015). However, the exploitation of subsurface natural resources faces an uncertain environment, sometimes coined the geological risk. Whatever the deep geothermal technology (conventional heat mining of deep aquifers, enhanced/engineered geothermal systems, or power production in magmatic settings) and its maturity level, this uncertainty makes geothermal operations high-risk projects with substantial initial investments (several M€) related to drilling costs. Even if insurance policies have recently been adapted to new targets, a single exploration failure may deter operators from a region with assumed good potential but complex geology for decades (e.g. the Hainaut aquifer in northern France in the early 1980s). Better knowledge of the subsurface is thus a key bottleneck for the deployment of deep geothermal technologies.

    It has been observed that the most efficient way to mitigate the geological risk is the collaborative integration of multidisciplinary data and interpretations into a geomodel of the subsurface. In a geothermal context, the first goal of such conceptual models is the prediction of the spatial distribution of temperature. Then, in order to reach economic profitability, deep geothermal projects need power levels that require convective exchanges with the reservoir at high flow rates through production and injection wells. In parallel, transient convective processes, which are ubiquitous in high-temperature magmatic settings, also control the temperature distribution and the natural state of many sedimentary basins and basement-type geothermal plays. The aforementioned conceptual models must consequently be dynamic by nature and integrate subsurface mass and energy transfers controlled by multiscale geological structures (a minimal form of the governing equations is sketched after this summary).

    Numerical simulation has become a powerful method for scientific inquiry, on a par with experimental and theoretical approaches, especially when data are as scarce and heterogeneous as subsurface data. Moreover, much progress has been made during the last decades in static geological modeling, dynamic geothermal reservoir modeling and high-performance computing, with several contributions from the CHARMS partners. Yet many developments are still largely independent and confined to academic circles, and there is no off-the-shelf software that integrates all of them in a consistent framework. The main objective of CHARMS is to take that step further and deliver the foundation components of an open framework, so that integrated dynamic conceptual models of geothermal systems in complex geological settings can be produced from the early phases of exploration, to increase the probability of success, and can evolve continuously through collaborative contributions into operational reservoir models that guarantee sustainable exploitation.

    The project is based on the three following pillars:
    • a consistent framework linking evolving complex geological models to the definition of the nonlinear physics of geothermal flows;
    • the improvement of the parallel ComPASS platform, which already shows promising results, with numerical schemes tailored to accurately model multiphase multicomponent geothermal flows on unstructured meshes with discontinuities (faults, fractures, ...);
    • baseline validation tests and industrial cases, including complex well geometries, to assess the usefulness of the new tools.

    CHARMS gathers scientists who know each other well and have strong experience in subsurface modeling. BRGM (the French Geological Survey) will lead the project, leveraging the numerical expertise of the universities of Nice and Paris 6 and of the Maison de la Simulation, as well as the industrial experience of Storengy (ENGIE Group).
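    As a pointer to the physics involved, a minimal single-phase form of the coupled mass and energy transfers that such models integrate (ComPASS targets the more general multiphase multicomponent case) reads, in standard notation:

        q = -\frac{k}{\mu} \left( \nabla p - \rho g \right)
        \partial_t (\phi \rho) + \nabla \cdot (\rho q) = 0
        (\rho c)_m \, \partial_t T + \rho_f c_f \, q \cdot \nabla T = \nabla \cdot (\lambda_m \nabla T)

    Here q is the Darcy velocity, k the permeability, \mu the fluid viscosity, p the pressure, \phi the porosity, and (\rho c)_m and \lambda_m the volumetric heat capacity and thermal conductivity of the fluid-rock medium. The nonlinearity emphasized in the first pillar enters through the strong pressure and temperature dependence of the fluid properties.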
