
Durham University
2,099 Projects, page 1 of 420
Project 2023 - 2027 | Partners: Durham University | Funder: UK Research and Innovation | Project Code: 2887800

The objective of my research is to explain the connection between money supply and inflation, which underwent a significant structural change after the 2008 financial crisis. Following the crisis, the Bank of England, the Federal Reserve, the European Central Bank, and other major central banks implemented Quantitative Easing and significantly increased the reserve accounts of commercial banks. The supply of base money (M0) in the UK tripled in 2008-2009 and had increased more than sixfold by 2015. According to the Quantity Theory of Money, this should have led to hyperinflation. However, UK inflation stayed within the target range (below 3%) until 2020 because of a change in the monetary transmission performed by commercial banks. To explain the effect of money supply on inflation, we need to study the relationship between the growth of reserves, controlled by the Bank of England, and the money actually used for economic transactions, including deposits created within the fractional-reserve banking system. Reserve expansion is only effective if banks use these reserves to provide credit to the real economy. I intend to uncover the factors that influence this process. First, I will study the effect of credit and liquidity risks on monetary transmission: banks operating in a riskier environment are more reluctant to lend to households and businesses, which slows transmission down. I expect to find quantitative support for this theory through empirical research. Then, I will evaluate different macroprudential policies using the dynamic stochastic general equilibrium (DSGE) framework.
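As a back-of-envelope illustration of the tension this project targets, the quantity-of-money identity M·V = P·Y shows how a fall in velocity can offset a large expansion of the money supply; the figures below are made-up round numbers, not UK data:

```python
# Illustrative sketch (made-up round numbers, not UK data): the
# quantity-of-money identity M * V = P * Y, where M is the money supply,
# V its velocity, P the price level, and Y real output.

def price_level(m, v, y):
    """Solve M * V = P * Y for the price level P."""
    return m * v / y

y = 100.0                                 # real output, held fixed
p0 = price_level(m=10.0, v=5.0, y=y)      # pre-QE baseline

# Naive reading of the Quantity Theory: M triples, V unchanged,
# so P should triple (the hyperinflation prediction).
p_naive = price_level(m=30.0, v=5.0, y=y)

# What was observed instead: new reserves largely sat idle, so velocity
# fell roughly in step, leaving the price level nearly unchanged.
p_qe = price_level(m=30.0, v=5.0 / 3.0, y=y)
print(p0, p_naive, p_qe)
```

On this reading, the empirical question the project asks is why velocity fell, i.e. why banks did not turn new reserves into credit.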
For further information contact us at helpdesk@openaire.eu
Project 2024 - 2029 | Partners: Durham University | Funder: UK Research and Innovation | Project Code: EP/X033201/1 | Funder Contribution: 1,385,240 GBP

Why is it that some computational problems admit algorithms that always work fast, that is, scale up well with the size of the data to be processed, while other computational problems are not like this and (appear to) admit only algorithms that scale up exponentially? Answering this question is one of the fundamental goals of Theoretical Computer Science. Computational complexity theory formalises the two kinds of problems as tractable (or polynomial-time solvable) and NP-hard, respectively. So we can rephrase the above question as follows: what kind of inherent mathematical structure makes a computational problem tractable? This very general question is known to be extremely difficult. The Constraint Satisfaction Problem (CSP) and its variants are extensively used towards answering this question for two reasons: on the one hand, the CSP framework is very general and includes a wide variety of computational problems, and on the other hand, it has a very rich mathematical structure, providing an excellent laboratory both for complexity classification methods and for algorithmic techniques. The so-called algebraic approach to the CSP has been very successful in this quest to understand tractability. The idea of this approach is that a certain algebraic structure (which can be viewed, roughly, as multi-dimensional symmetries) in problem instances leads to tractability, while the absence of such structure leads to NP-hardness. This approach has already provided very deep insights and delivered very strong complexity classification results. In particular, it explained which mathematical features distinguish tractable and NP-hard problems within the class of standard CSPs.
The proposed research will aim to extend this understanding to Promise Constraint Satisfaction Problems, which is a much larger class of problems, by uncovering deeper mathematical reasons for tractability and NP-hardness, thus providing stronger evidence that tractable problems share a certain algebraic structure. We will also apply our new theory to resolve long-standing open questions about some classical NP-hard optimisation problems, specifically how much the optimality demand must be relaxed there to guarantee tractability.
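To make the "multi-dimensional symmetries" concrete, the sketch below checks whether a Boolean relation is preserved by the majority polymorphism, one of the classic algebraic conditions that implies tractability in the algebraic approach: the satisfying assignments of a 2-SAT clause pass the test, while those of a 3-SAT clause fail. This is a minimal illustration, not the project's machinery:

```python
from itertools import product

def maj(a, b, c):
    """Boolean majority: returns the value appearing at least twice."""
    return (a & b) | (b & c) | (a & c)

def preserved_by_majority(relation):
    """Check whether applying maj coordinatewise to every triple of
    tuples from the relation always lands back inside the relation."""
    rel = set(relation)
    for t1, t2, t3 in product(rel, repeat=3):
        if tuple(maj(a, b, c) for a, b, c in zip(t1, t2, t3)) not in rel:
            return False
    return True

# Satisfying assignments of a 2-SAT clause (x OR y).
two_sat = [(0, 1), (1, 0), (1, 1)]

# Satisfying assignments of a 3-SAT clause (x OR y OR z):
# every Boolean triple except (0, 0, 0).
three_sat = [t for t in product((0, 1), repeat=3) if t != (0, 0, 0)]

print(preserved_by_majority(two_sat))    # True: majority symmetry present
print(preserved_by_majority(three_sat))  # False: e.g. maj of (1,0,0),
                                         # (0,1,0), (0,0,1) is (0,0,0)
```

The presence of such a polymorphism for every relation in a constraint language is exactly the kind of structure the algebraic approach uses to separate tractable from NP-hard CSPs.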
Project 2023 - 2026 | Partners: Durham University | Funder: UK Research and Innovation | Project Code: MR/X006069/1 | Funder Contribution: 575,703 GBP

Astronomers have so far observed only 5% of the content of the Universe: the luminous (or baryonic) matter. The remaining 95% is invisible, consisting of so-called Dark Matter and Dark Energy. The nature of dark matter and dark energy is one of the most pressing fundamental questions in modern physics. This Dark Sector of our Universe has remained impossible to detect directly, because neither component interacts with standard matter particles. Moreover, many theories predict that dark matter will remain fundamentally undetectable in terrestrial experiments and can only be probed by astrophysical laboratories. If these theories are correct, dark matter can only be studied where it gathers in sufficient quantities for its gravity to affect the visible objects around it. I propose to track the behaviour of dark matter in galaxy clusters (the most massive observable structures in the universe, also called 'cosmic beasts') to distinguish between the three leading models: cold, warm and self-interacting dark matter. My FLF project exploits a dramatic increase over the past decade in observations of galaxy clusters by the world's biggest telescopes, reflecting the field's recognition as a top-priority goal. I was awarded observing time on the Hubble Space Telescope in the largest category of programme (>100 orbits) to obtain the deepest-ever imaging of clusters' surroundings, plus follow-up spectroscopy from the largest telescope on Earth (the VLT). I designed these observations to map clusters' dark matter via the effect of 'gravitational lensing', which distorts and magnifies objects behind the cluster.
I use these data (i) by themselves, (ii) to calibrate the largest (but shallower) Hubble imaging surveys, and (iii) to set the agenda for, and optimise, facilities like Euclid, Athena and the James Webb Space Telescope through the 2020s. Cosmological simulations indicate that galaxy clusters are the best laboratories in which to distinguish between models of dark matter, because they are still growing. Clusters grow by merging with each other; every merger acts like a gigantic particle collider. The properties of dark matter are revealed by its trajectory through a collision, which should lie between that of stars and that of (hydrogen) gas. The properties of stars and hydrogen are well understood, so they bookend measurements of dark matter. Traditional research programmes usually separate measurements of dark matter, stars and gas, because they require observations from different (infrared, ultraviolet, X-ray) telescopes. I have developed a multiwavelength analysis to enable previously impossible measurements, such as the time-scale on which dark matter and gas are funnelled into clusters, how quickly clusters reach equilibrium, and constraints on possible dark matter particle interactions. I have also led the establishment of a new research area, which I have expanded since the start of my FLF. When transient events (such as supernova explosions) happen behind a galaxy cluster, light from the explosion can be gravitationally lensed and visible along more than one line of sight. Measuring the time delay between multiply-imaged versions of a supernova increases the resolution with which the cluster's dark matter can be mapped. I intentionally scheduled my HST and VLT observations to enable the discovery and monitoring of such events.
They also offer the (high-risk/high-reward) possibility of discovering electromagnetic counterparts to lensed gravitational waves, a new field in which our team has become a leader by developing the theoretical framework and observational strategy to detect the first gravitationally lensed gravitational wave. My UKRI FLF research programme exploits the latest multiwavelength data from world-class facilities. It uses my high-precision techniques to analyse big data, and its results are interpreted within the world-leading theoretical framework of Durham's state-of-the-art cosmological simulations.
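For a sense of the angular scales at which cluster lensing distorts background objects, the sketch below evaluates the standard point-mass Einstein radius, theta_E = sqrt(4GM/c^2 * D_ls/(D_l * D_s)); the mass and distances are illustrative round numbers, not measurements from this programme:

```python
import math

# Hedged sketch (illustrative values, not from the proposal): the
# Einstein radius of a point-mass gravitational lens,
#   theta_E = sqrt(4 G M / c^2 * D_ls / (D_l * D_s)),
# using angular-diameter distances to the lens (D_l), to the source
# (D_s), and between lens and source (D_ls).

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # one megaparsec, m

def einstein_radius_arcsec(mass_kg, d_l, d_s, d_ls):
    """Einstein radius in arcseconds for a point-mass lens."""
    theta_rad = math.sqrt(4 * G * mass_kg / C**2 * d_ls / (d_l * d_s))
    return theta_rad * 180 / math.pi * 3600

# A ~10^15 solar-mass cluster lensing a more distant background source
# (distances are round placeholders, ignoring detailed cosmology):
theta = einstein_radius_arcsec(1e15 * M_SUN,
                               d_l=1000 * MPC,
                               d_s=2000 * MPC,
                               d_ls=1000 * MPC)
print(f"{theta:.0f} arcsec")  # tens of arcseconds, typical of cluster lenses
```

Tens of arcseconds is why massive clusters produce the strongly distorted, multiply-imaged arcs that make them such effective dark-matter laboratories.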
Project 2006 - 2009 | Partners: Durham University | Funder: UK Research and Innovation | Project Code: G0401090/1 | Funder Contribution: 245,737 GBP

There has been an explosion of knowledge over the past 15 years about how different areas of the human brain contribute to our visual experience of the world. These advances have come from a convergence of several different methodologies: brain imaging, studies of patients with specific kinds of brain damage, and physiological studies of animals. A.D. Milner and M.A. Goodale have summarized these developments in a recent book (Sight Unseen, Oxford University Press, 2003). The present application combines two of these investigative methods: studying the behaviour and perceptual abilities of certain patients with specific brain damage, while also recording which parts of their brains are active during such tests by means of functional neuroimaging (fMRI). This should help us better understand the functions of the two visual streams, as well as how the brain can adjust to severe damage to one of them.
Project 2023 - 2027 | Partners: Durham University | Funder: UK Research and Innovation | Project Code: 2876830

The objective of this PhD project is to create a robust framework for simulating quantum field theories (QFTs) by integrating quantum computational methods, tensor network theories (TNT), and machine learning (ML) techniques. This interdisciplinary endeavour aims to mitigate the computational challenges inherent in classical simulations of QFTs, paving the way for deeper insights into fundamental physics and high-energy phenomena. Quantum algorithms tailored for QFT simulations on quantum hardware will be developed alongside quantum-classical hybrid algorithms to harness both computational paradigms. The project will implement tensor network decomposition methods to efficiently represent and manipulate states and operators in QFTs, exploring their entanglement structures and devising efficient algorithms for simulating low-dimensional QFTs. Machine learning techniques will be employed to optimize tensor network structures and quantum circuits, and for error mitigation to enhance the robustness and accuracy of quantum simulations. Benchmarking against classical methods and existing quantum simulation approaches will validate the developed frameworks. Performance will be optimized for different quantum hardware architectures to investigate scalability on real, near-term quantum computers. The expected outcomes include an optimized framework for QFT simulations leveraging quantum computing, tensor networks, and ML; benchmark results comparing its performance and accuracy against classical methods; and new insights into the entanglement structure of QFTs.
This project has the potential to significantly influence the way QFT simulations are conducted, fostering further innovations at the nexus of quantum computing, machine learning, and high-energy physics. Moreover, the project could be extended to explore applications in other physics areas like condensed matter physics or quantum gravity and employ advanced ML techniques like deep learning or reinforcement learning for further optimization of the simulation framework. The candidate will engage in a cutting-edge interdisciplinary field with extensive potential for theoretical and practical advancements in quantum computing and high-energy physics through this project.
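A quick parameter count shows why tensor-network representations matter for this kind of simulation: a full n-qubit state vector grows exponentially with n, while a matrix product state (MPS) with a fixed bond dimension grows only linearly. This is generic background arithmetic, not the project's specific decomposition:

```python
# Back-of-envelope sketch (illustrative, not from the project): storage
# cost of an n-qubit state as a full state vector versus as a matrix
# product state (MPS) of bond dimension chi. An MPS stores one rank-3
# tensor of shape (chi, 2, chi) per site (boundary tensors are smaller,
# so this count is an upper bound).

def state_vector_params(n):
    """Complex amplitudes in the full 2^n state vector."""
    return 2 ** n

def mps_params(n, chi):
    """Upper bound: n tensors of shape (chi, 2, chi)."""
    return n * 2 * chi * chi

n, chi = 50, 64
print(state_vector_params(n))   # 1125899906842624 (~10^15 amplitudes)
print(mps_params(n, chi))       # 409600 parameters
```

The linear scaling holds only while the state's entanglement stays low enough for a modest bond dimension to capture it, which is why the project's focus on entanglement structure is central to making the simulations tractable.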