Accelrys Limited

15 Projects, page 1 of 3
  • Funder: UK Research and Innovation Project Code: EP/G007489/2
    Funder Contribution: 1,590,540 GBP

    The discovery that matter is made up of atoms ranks as one of mankind's greatest achievements. Twenty-first century science is dominated by a quest for the mastery (both in terms of control and understanding) of our environment at the atomic level.

    In biology, understanding life (preserving it, or even attempting to create it) revolves around large, complex molecules: RNA, DNA, and proteins. Global warming is dictated by the particular way atoms are arranged to make small greenhouse gas molecules, carbon dioxide and so on. The drive for faster, more efficient, cheaper computer chips forces nanotechnology upon us: as the transistors that make up the microscopic circuits are packed ever closer together, electronic engineers must understand where the atoms are placed, or misplaced, in the semiconducting and insulating materials. And astronomers are daily discovering new planets outside our solar system, orbiting alien stars. The largest are the easiest to spot, and many are far larger than Jupiter; the more massive the planet, the higher the pressures endured by the matter that makes up its bulk. How can we hope to determine the structure of matter under these conditions?

    The atomic theory of matter leads to quantum mechanics, a mechanics of the very small. In principle, to understand and predict the behaviour of matter at the atomic scale simply requires the solution of the quantum mechanical Schrödinger equations. This is a challenge in itself, but in an approximate way it is now possible to compute quickly the energies and properties of fairly large collections of atoms. But is it possible to predict how those atoms will be arranged in Nature, ex nihilo, from nothing but our understanding of physics?

    Some have called it a scandal that the physical sciences cannot routinely predict the structure of even simple crystals, but most have assumed it to be a very difficult problem: a minimum of the energy must be found in a many-dimensional space of all the possible structures. Those researchers brave enough to tackle this challenge have done so by reaching for complex algorithms, such as genetic algorithms, which appeal to evolution to breed ever better structures (with "better" taken to mean more stable). However, I have discovered, to my surprise and to others', that the very simplest algorithm (throw the collection of atoms into a box, and move the atoms downhill on the energy landscape) is remarkably effective if it is repeated many times.

    This approach needs no prior knowledge of chemistry. Indeed, the scientist is taught chemistry by its results; this is critical if the method is to be used to predict the behaviour of matter under extreme conditions, where learned intuition will typically fail. I have used this approach, which I call random structure searching, to predict the structure of crystals ex nihilo. My first application of it was to silane at very high pressures, and the structure I predicted has recently been seen in experiments. But probably the most impressive application so far has been predicting the structure of hydrogen at the huge pressures found in the gas giant planets, where it may be a room-temperature superconductor.

    In the course of my fellowship I will extend this work: to try to anticipate the structure of matter in the newly discovered exoplanets, to discover and design materials with extreme (and hopefully extremely useful) properties, and to help pharmaceutical researchers understand the many forms that their drug molecules adopt when they crystallise.
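
    The search loop described above is simple enough to sketch. The following is a minimal illustration, not the fellowship's actual code: a toy Lennard-Jones potential stands in for the first-principles energies used in practice, SciPy's L-BFGS-B minimiser plays the role of the downhill relaxation, and all names and parameters are invented for the example.

        import numpy as np
        from scipy.optimize import minimize

        def lj_energy(flat_coords):
            """Total Lennard-Jones energy of a cluster, in reduced units."""
            pos = flat_coords.reshape(-1, 3)
            energy = 0.0
            for i in range(len(pos)):
                for j in range(i + 1, len(pos)):
                    r = np.linalg.norm(pos[i] - pos[j])
                    energy += 4.0 * (r**-12 - r**-6)
            return energy

        def random_search(n_atoms=8, n_trials=50, box=2.5, seed=0):
            """Repeat: random structure in a box -> relax downhill -> keep the best."""
            rng = np.random.default_rng(seed)
            best_energy, best_structure = np.inf, None
            for _ in range(n_trials):
                # "Throw the atoms into a box": a random starting structure,
                # resampled if any two atoms start unphysically close.
                while True:
                    x0 = rng.uniform(0.0, box, size=3 * n_atoms)
                    pos = x0.reshape(-1, 3)
                    dists = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
                    if dists[np.triu_indices(n_atoms, k=1)].min() > 0.7:
                        break
                # "Move the atoms downhill": local relaxation to the nearest minimum.
                result = minimize(lj_energy, x0, method="L-BFGS-B")
                if result.fun < best_energy:
                    best_energy, best_structure = result.fun, result.x.reshape(-1, 3)
            return best_energy, best_structure

        energy, structure = random_search()
        print(f"Lowest energy found over 50 random starts: {energy:.4f}")

    Because each random trial is independent, the repeats parallelise trivially, which is part of why such a naive-looking search is practical at scale.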

  • Funder: UK Research and Innovation Project Code: EP/G007489/1
    Funder Contribution: 1,360,330 GBP

    (Project summary identical to that of EP/G007489/2 above.)

  • Funder: UK Research and Innovation Project Code: EP/G055882/1
    Funder Contribution: 309,547 GBP

    Quantum mechanics has had a profound and pervasive influence on science and technology. Phenomena that are intrinsically quantum mechanical, such as magnetism, electron transport in semiconductors, and the effect of impurity atoms in materials, lie at the heart of almost every branch of industry. Quantum mechanical calculations of properties and processes from "first principles" are capable of making accurate quantitative predictions, but require solving the Schrödinger equation, which is extremely difficult and can only be done using powerful computers. In contrast, empirical modelling approaches are relatively cheap but lack the predictive power of first-principles methods (which are parameter-free and take as input only the atomic numbers of the constituent atoms). This predictive capability is essential for making rapid progress on new and challenging problems where experimental data are insufficient, and for generating useful empirical approaches, or checking their reliability where they exist.

    Within the class of first-principles methods, one approach that has been outstandingly successful is Density Functional Theory (DFT), as it combines high accuracy with moderate computational cost. Nevertheless, the computational effort of conventional DFT approaches increases as the cube of the number of atoms, making them unable to tackle problems with more than a few hundred atoms even on modern supercomputers. Since the pioneering work of the Nobel laureate Walter Kohn, it has been known that it is possible to reformulate DFT so that it scales linearly, which would in principle allow calculations with many thousands or even millions of atoms. The practical realisation of this, however, in a method as robust and accurate as conventional cubic-scaling DFT, has been extremely difficult. The ONETEP approach, developed over many years by the applicants of this proposal, has achieved just that.

    ONETEP is at the cutting edge of developments in first-principles calculations. However, while the fundamental difficulties of performing accurate first-principles calculations at linear-scaling cost have been solved, only a small core of functionality is currently available in ONETEP, which prevents its wide application. In this collaborative project between three universities, the original developers of ONETEP will lead an ambitious workplan whereby the functionality of the code will be rapidly and significantly enriched. The code-development ethos of ONETEP, namely that software should be robust, user-friendly, modular, portable and highly efficient on current and future HPC technologies, will be of fundamental importance, and will be further strengthened by rigorous cross-checking between the three institutions of this proposal. The developments are also challenging from a theoretical point of view, as they need to fit within the linear-scaling framework of ONETEP, with its highly non-trivial formulation of DFT in terms of in situ optimised localised functions. The programme of work provides much added value, as the few fundamental enabling technologies developed in its first stages will underpin many of the functional capabilities that follow. The result will be a tool capable of a whole new level of materials simulation at the nanoscale with unprecedented accuracy.

    It will find immediate application in simulations in molecular biology, nanostructures and materials, which underpin solutions to urgent current problems in energy, the environment and health. Through the increasing number of commercial and academic users and developers of ONETEP, the worldwide dissemination and wide use of this novel tool will be rapid; finally, the expanding ONETEP Developers' Group will coordinate the best strategies for the future maintenance and development of the software.
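
    As a rough illustration of the scaling argument above (a toy sketch assuming a simple banded coupling pattern, not the ONETEP formulation itself): if each localised function overlaps only a bounded number of neighbours, the matrices describing the system are sparse, so the work of applying them grows linearly with system size, whereas dense diagonalisation grows as the cube.

        import numpy as np
        from scipy.sparse import diags

        def banded_hamiltonian(n, bandwidth=3):
            # Each "localised function" couples to at most 2*bandwidth neighbours,
            # mimicking the sparsity that spatial localisation produces.
            offsets = list(range(-bandwidth, bandwidth + 1))
            bands = [np.random.rand(n - abs(k)) for k in offsets]
            return diags(bands, offsets, format="csr")

        for n in (1_000, 10_000, 100_000):
            h = banded_hamiltonian(n)
            v = np.random.rand(n)
            _ = h @ v  # cost tracks h.nnz, which grows linearly in n
            print(f"N={n:>7}  nonzeros={h.nnz:>8}  dense-diagonalisation flops ~ {n**3:.0e}")

    The nonzero count (and hence the matrix-vector cost) grows tenfold with each tenfold increase in N, while the dense-diagonalisation estimate grows a thousandfold.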

  • Funder: UK Research and Innovation Project Code: EP/L015552/1
    Funder Contribution: 4,544,990 GBP

    Moore's Law states that the number of active components on a microchip doubles every 18 months. Variants of this law can be applied to many measures of computer performance, such as memory and hard disk capacity, and to reductions in the cost of computations. Remarkably, Moore's Law has held for over 50 years, during which time computer speeds have increased by a factor of more than a billion. This rise of computational power has affected all of our lives in profound ways, through the widespread use of computers, the internet and portable electronic devices such as smartphones and tablets.

    Unfortunately, Moore's Law is not a fundamental law of nature, and sustaining this extraordinary rate of progress requires continuous hard work and investment in new technologies, most of which relate to advances in our understanding of, and ability to control, the properties of materials. Computer software plays an important role in enhancing computational performance, and in many cases it has been found that for every factor of 10 increase in performance achieved by faster hardware, improved software has contributed a further factor of 100. Improved software is also essential for extending the range of physical properties and processes that can be studied computationally.

    Our EPSRC Centre for Doctoral Training in Computational Methods for Materials Science aims to provide training in numerical methods and modern software development techniques, so that the students in the CDT are capable of developing innovative new software which can be used, for instance, to help design new materials and to understand the complex processes that occur in materials. The UK, and in particular Cambridge, has been a pioneer in both software and hardware since the earliest programmable computers, and through this strategic investment we aim to ensure that this lead is sustained well into the future.
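
    As a quick check of the arithmetic in the paragraph above (a worked example, not part of the original summary): one doubling every 18 months, sustained for 50 years, compounds to roughly 33 doublings, comfortably exceeding a billion-fold.

        # Doubling every 18 months, sustained for 50 years.
        years = 50
        doublings = years * 12 / 18
        factor = 2 ** doublings
        print(f"{doublings:.1f} doublings -> factor of about {factor:.1e}")
        # ~33.3 doublings -> factor of about 1.1e10, i.e. over ten billion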

  • Funder: UK Research and Innovation Project Code: EP/N002288/1
    Funder Contribution: 346,710 GBP

    Two of the most critical global challenges currently being faced are energy security and climate change. In the UK, more than £100 bn of investment in new power stations and grid infrastructure is projected within the next decade, both to replace ageing plant and to allow for the incorporation of renewable sources. Such changes will involve a paradigm shift in the ways in which we generate and transmit electricity. Since a central element of all items of power plant is electrical insulation, meeting our future challenges through the deployment of innovative new plant will require the development and exploitation of new high-performance insulation material systems.

    Polymer nanocomposites have demonstrated clear potential, but the lack of detailed understanding of the underlying physics and chemistry is a major impediment to the technological realisation of this potential. In certain laboratory studies, nanodielectric materials have out-performed unfilled and traditional micro-composite insulating materials; however, entirely contrary results have also been reported elsewhere. Undoubtedly, this variability in macroscopic behaviour comes about as a consequence of our inability to define and control the key factors that dictate the dielectric behaviour of nanocomposites. The overarching aim of this project is to resolve this issue so that the potential of dielectric nanocomposites, or nanodielectrics, can be fully exploited. As such, the project is totally aligned with the EPSRC Materials for Energy theme, in which it is accepted that "in the field of advanced materials it will be necessary to strengthen approaches to the rational design and characterisation of advanced materials and their integration into structures and systems". It also aligns with the Advanced Materials theme of the "Eight Great Technologies", in which it is accepted that "these materials are essential to 21st century manufacturing in a UK market worth £170 billion per annum and representing 15 per cent of GDP".

    Our research hypothesis is that the macroscopic properties of nanodielectrics cannot be reliably controlled without understanding the processes that occur at the interfaces between the matrix material and the nanoparticles, because these regions directly affect two critical issues. First, interfacial interactions affect the nanoparticle dispersion, which has a major bearing on many physical properties; second, the nature of the interface determines the local density of states in the system, and thereby the material's overall electrical characteristics. Understanding such local processes is challenging, and we propose to do so through a combination of computational simulation and experiment, with the two closely aligned, allowing simulation to direct experiment and experimental results to refine the simulation.

    The work programme is divided into three distinct themes, which will progressively move the work from fundamentals to exploitation. Theme 1 will concentrate on model systems, where simulation and experiment can be most closely aligned. Theme 2 will then seek to deploy the key messages in the development of technologically relevant systems and processes. Throughout, Theme 3 will engage with a range of stakeholders, from key industry players (equipment manufacturers, energy utilities, standards bodies) to the general public, to maximise the reach and significance of the project's ultimate impact (economic, environmental, societal). We see the involvement of our Industrial Users Group as particularly important, both in helping to guide the project and in ensuring acceptance of the technologies that will ultimately arise.

