
Jet Propulsion Laboratory

8 Projects, page 1 of 2
  • Funder: UK Research and Innovation Project Code: NE/V001183/1
    Funder Contribution: 555,021 GBP

    Despite the well-recognised influence of clouds and precipitation on our climate, there are still critical gaps in our ability to observe the cloud properties needed to test and improve how cloud processes are represented in models. This leads to clouds and aerosols being the biggest source of uncertainty in climate models, according to the IPCC. In addition, uncertainties about cloud processes have important impacts on our ability to predict the weather, because precipitation is produced by clouds, clouds modulate the amount of sunlight we receive during the day and heat we lose at night, and latent heat processes in clouds and precipitation drive dynamical changes in storms. Low-altitude clouds of liquid water droplets cover large swathes of the globe and cool the earth's climate. However, our ability to simulate these clouds in climate models is poor, and the production of drizzle has been identified as a key weakness. We need new observations to unravel the processes in these clouds and improve their representation in simulations. Meanwhile, ice clouds cover around one third of the earth at any one time and provide a net warming on average. However, the magnitude of this warming is very uncertain, and their impact on our climate is very sensitive to what we assume about their physics. Thus we urgently need to constrain the physical processes controlling how ice particles evolve in natural clouds. Finally, stratiform precipitation is an important component of the hydrological cycle and the radiation budget. Typically, such precipitation includes an ice phase aloft and a liquid phase at lower altitudes. Yet processes in both phases remain uncertain and require new observations to constrain them robustly. Our proposal exploits new radar technology to break through the current limits on the information we can retrieve about cloud properties and the processes that drive the evolution of the hydrometeors within them. 
    With the help of our project partners at the Met Office and ECMWF, we will use this information to improve the simulation of cloud processes in weather and climate forecasts. In 2018 the UK Space Agency and the Centre for Earth Observation Instrumentation agreed to fund the development of a new 200 GHz (G-band) Doppler radar system, called GRaCE, led by investigators Huggard and Battaglia. This ground-breaking demonstrator instrument will collect its first data at the Chilbolton Observatory early in 2020, and will be able to penetrate multiple layers of cloud with unprecedented sensitivity to small, sub-millimetre particles thanks to the radar's 1.5 mm wavelength, the shortest of any cloud radar system worldwide. The radar will be operated for 22 months in synergy with a suite of other remote sensing instruments. This unprecedented dataset will be exploited by GRaCE scientists who are leaders in radar remote sensing and have spearheaded retrieval techniques for multi-wavelength Doppler radars. Vertical profiles of cloud physical properties, including water content as well as drizzle-drop and ice-crystal size distributions, will be obtained, and these data will be used to test the representation of cloud processes in numerical models in much greater detail than has been possible before. Through this leap forward in our ability to observe clouds, the GRaCE system will become the forerunner for a new stream of ground-based remote sensing instruments, greatly strengthening the current Earth observing system. The high frequency of the radar means that it will also be suitable for development into airborne/spaceborne instruments for cloud-related studies, and indeed the proposal is very timely given parallel efforts at NASA's JPL to build an airborne differential absorption radar (for measuring water vapour) at lower frequencies (165 to 173 GHz), and to develop CubeSat radars in the G-band (see NASA-JPL's LoS).
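As a quick numerical sanity check (ours, not part of the proposal; variable names are illustrative), the 1.5 mm wavelength quoted for the 200 GHz G-band radar follows directly from the standard relation λ = c/f:

```python
# Verify that a 200 GHz radar operates at roughly a 1.5 mm wavelength.
c = 299_792_458.0            # speed of light, m/s
f = 200e9                    # GRaCE operating frequency, Hz (G-band)
wavelength_mm = c / f * 1e3  # lambda = c / f, converted from m to mm
print(round(wavelength_mm, 2))  # ~1.5 mm
```

The same relation puts the 165-173 GHz differential absorption radar mentioned above at a slightly longer wavelength (about 1.7-1.8 mm).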

  • Funder: UK Research and Innovation Project Code: EP/R02572X/1
    Funder Contribution: 11,588,400 GBP

    Nuclear facilities require a wide variety of robotics capabilities, engendering a range of extreme RAI challenges. NCNR brings together a diverse consortium of experts in robotics, AI, sensors, radiation and resilient embedded systems to address these complex problems. In high-gamma environments, human entries are not possible at all. In alpha-contaminated environments, air-fed suited human entries are possible, but they engender significant secondary waste (contaminated suits) and reduce worker capability. We have a duty to eliminate the need for humans to enter such hazardous environments wherever technologically possible. Hence, nuclear robots will typically be remote from human controllers, creating significant opportunities for advanced telepresence. However, limited bandwidth and situational awareness demand increased intelligence and autonomous control capabilities on the robot, especially for performing complex manipulations. Shared control, where human and AI collaboratively control the robot, will be critical because i) safety-critical environments demand a human in the loop, while ii) complex remote actions are too difficult for a human to perform reliably and efficiently. Before decommissioning can begin, and while it is progressing, characterization is needed. This can include 3D modelling of scenes, detection and recognition of objects and materials, detection of contaminants, measurement of types and levels of radiation, and other sensing modalities such as thermal imaging. This will necessitate novel sensor design, advanced algorithms for robotic perception, and new kinds of robots to deploy sensors into hard-to-reach locations. 
    To carry out remote interventions, both situational awareness for the remote human operator and guidance of autonomous/semi-autonomous robotic actions will need to be informed by real-time multi-modal vision and sensing, including real-time 3D modelling and semantic understanding of objects and scenes, active vision in dynamic scenes, and vision-guided navigation and manipulation. The nuclear industry is high-consequence, safety-critical and conservative. It is therefore critically important to rigorously evaluate how well human operators can control remote technology to safely and efficiently perform the tasks that industry requires. All NCNR research will be driven by a set of industry-defined use-cases (WP1). Each use-case is linked to industry-defined testing environments and acceptance criteria for performance evaluation in WP11. WPs 2-9 deliver a variety of fundamental RAI research, including radiation-resilient hardware, novel design of both robots and radiation sensors, advanced vision and perception algorithms, mobility and navigation, grasping and manipulation, multi-modal telepresence and shared control. The project is based on modular design principles. WP10 develops standards for modularisation and module interfaces, which will be met by a diverse range of robotics, sensing and AI modules delivered by WPs 2-9. WP10 will then integrate multiple modules onto a set of pre-commercial robot platforms, which will be evaluated against end-user acceptance criteria in WP11. WP12 is devoted to technology transfer, in collaboration with numerous industry partners and the Shield Investment Fund, which specialises in venture capital investment in RAI technologies, taking novel ideas through to fully fledged commercial deployments. Shield has ring-fenced £10 million of capital to run alongside all NCNR Hub research, to fund spin-out companies and the industrialisation of Hub IP. 
    We have rich international involvement, including the NASA Jet Propulsion Laboratory and the Carnegie Mellon National Robotics Engineering Center as collaborators in the USA, and collaboration with the Japan Atomic Energy Agency to help us carry out test deployments of NCNR robots in the unique Fukushima mock-up testing facilities at the Naraha Remote Technology Development Center.

  • Funder: UK Research and Innovation Project Code: EP/P030181/1
    Funder Contribution: 6,160,540 GBP

    Optical fibres lie at the heart of our increasingly technological society: supporting the internet and mobile communications we all now take for granted, saving lives through medical diagnosis and interventions using fibre-optic endoscopes, and enabling the mass production of a huge array of commercial products through fibre-laser-based materials processing. However, current fibre optics technology has its limitations, due largely to the fact that the light is confined to a solid glass core. This places fundamental restrictions on the power and wavelength range over which signals can be transmitted, on the speed at which signals propagate, and on sensitivity to the external environment. These limits are now starting to impose restrictions in many application areas. For example, in telecommunications, nonlinear interactions between wavelength channels limit the maximum overall data transmission capacity of current single-mode fibres to ~100-200 Tbit/s (for amplified terrestrial systems). Moreover, nonlinear, thermal and material damage thresholds combine to limit the maximum peak and average powers that can be delivered in a tightly focusable beam. This restricts the range of potential uses, particularly in the important ultrashort-pulse regime increasingly used for a wide variety of materials processing applications. These limitations can in principle be overcome by exploiting new light guidance mechanisms in fibres with a hollow core surrounded by a fine glass microstructure. Such fibres are generally referred to as Hollow Core Fibres (HCFs). Within this Programme we will seek to reinvent fibre optics technology, replacing the glass core with air or vacuum to produce Optical Fibres 2.0, offering vastly superior but largely unexplored potential. 
Our ultimate vision is that of a Connected World, where devices, machines, data centres and cities can be linked through these hollow light pipes for faster, cheaper, more resilient and secure communications. A Greener and Healthier World, where intense laser light can be channelled to produce goods and run combustion engines more efficiently and to image cancer tissues inside our bodies in real time. And an Explorative World, where hollow lightguides will enable scientific breakthroughs in attosecond science, particle physics, metrology and interplanetary exploration. Our overall ambition is therefore to revisit the way we think about light guidance and to develop a disruptive technology that challenges conventional thinking. The programme will provide the UK with a world-leading position both in HCF technology itself and in the many new applications and services that it will support.

  • Funder: UK Research and Innovation Project Code: NE/N018508/1
    Funder Contribution: 1,802,010 GBP

    We propose a large-scale, multi-faceted, international programme of research on the functioning of the Earth system at a key juncture in its history - the Early Jurassic. At that time the planet was subject to distinctive tectonic, magmatic, and solar system orbital forcing, and fundamental aspects of the modern biosphere were becoming established in the aftermath of the end-Permian and end-Triassic mass extinctions. Breakup of the supercontinent Pangaea was accompanied by the creation of seaways, the emplacement of large igneous provinces, and the occurrence of biogeochemical disturbances, including the largest-magnitude perturbation of the carbon cycle in the last 200 Myr, at the same time as the oceans became oxygen deficient. Continued environmental perturbation played a role in the recovery from the end-Triassic mass extinction, in the rise of modern phytoplankton, in preventing recovery of the pre-existing marine fauna, and in catalysing a 'Mesozoic Marine Revolution'. However, existing knowledge is based on scattered and discontinuous stratigraphic datasets, meaning that correlation errors (i.e. mismatches between datasets from different locations) confound attempts to infer temporal trends and causal relationships, leaving us without a quantitative, process-based understanding of Early Jurassic Earth system dynamics. This proposal aims to address this fundamental gap in knowledge via a combined observational and modelling approach, based on a stratigraphic 'master record' accurately pinned to a robust geological timescale and integrated with an accurate palaeoclimatic, palaeoceanographic and biogeochemical modelling framework. The project has already received $1.5M from the International Continental Drilling Programme towards drilling a deep borehole at Mochras, West Wales, to recover a new 1.3-km-long core, representing an exceptionally expanded and complete 27 Myr sedimentary archive of Early Jurassic Earth history. 
    This core will allow investigation of the Earth system at a scale and resolution hitherto only attempted for the last 65 million years (i.e. archive sedimentation rate = 5 cm/kyr, or 20 yr/mm). We will use the new record together with existing data and an integrative modelling approach to produce a step change in understanding of the Jurassic timescale and Earth system dynamics. In addition to order-of-magnitude improvements in timescale precision, we will: distinguish astronomically forced from non-astronomically forced changes in the palaeoenvironment; use coupled atmosphere-ocean general circulation models to understand controls on the climate system and ocean circulation regime; understand the history of relationships between astronomically forced cyclic variation in environmental parameters at timescales ranging from 20 kyr to 8 Myr, and link these to specific aspects of forcing relating to solar energy received; use estimated rates and timing of environmental change to test postulated forcing mechanisms, especially from known geological events; constrain the sequence of triggers and feedbacks that control the initiation of, evolution of, and recovery from the carbon cycle perturbation events; and use Earth system models to test hypotheses for the origins of 'icehouse' conditions. Thirty-six project partners from 13 countries substantially augment and extend the UK-based research.
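The two sedimentation-rate figures quoted above are the same quantity in different units; a short unit conversion (ours, for illustration only) confirms they agree:

```python
# Convert a sedimentation rate of 5 cm/kyr into years of record per mm of core.
rate_cm_per_kyr = 5.0
mm_per_year = rate_cm_per_kyr * 10 / 1000  # 5 cm/kyr = 50 mm per 1000 yr
years_per_mm = 1 / mm_per_year             # invert to get temporal resolution
print(years_per_mm)  # 20.0 yr/mm, matching the quoted figure
```

At that rate, the 1.3-km-long core spanning 27 Myr is consistent: 27,000 kyr at 5 cm/kyr gives on the order of 1.35 km of sediment.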

  • Funder: UK Research and Innovation Project Code: EP/P017487/1
    Funder Contribution: 1,398,050 GBP

    This project addresses the problem of "characterisation" of Extreme Environments (EE) by deploying and combining information from a variety of different Remote Sensing modalities. Our principal application area is nuclear decommissioning; however, our research outputs will be relevant to other EE. Before nuclear decommissioning interventions can happen, the facility/plant being decommissioned must be "characterised" to understand: physical layout and 3D geometry; structural integrity; and contents, including particular objects of interest (e.g. fuel rod debris). 3D plant models must further be annotated with additional sensed data: thermal information and the types/levels/locations of contamination (radiological, chemical etc.). Characterisation may be needed before, during or after POCO (Post Operation Clean Out). "Quiescent buildings" may be over half a century old, with uncertain internal layout and contents. Characterisation is needed in dry environments (e.g. contaminated concrete "caves") and wet environments (e.g. legacy storage ponds). Caves may be unlit, causing difficult vision problems (shadows, contrast, saturation) with robot-mounted spotlights. Underwater environments cause significant visibility degradation for RGB cameras and render most depth/range sensors unusable. New technologies, e.g. acoustic cameras, engender interesting new challenges in developing algorithms to process these new kinds of image data. In many cases, robots are needed to deploy Remote Sensors into Extreme Environments and move them to desired locations and viewing poses. In some cases, robots must also assist characterisation by retrieving samples of contaminated materials. In many cases, real-time Remote Sensing data must also be applied to inform and control the actions of robots while performing remote intervention tasks in EE. 
    This project brings together a unique, cross-disciplinary and international team of researchers and institutes, spanning three continents, to address these challenges. End-users NNL and JAEA will advise on scenarios and challenges for Remote Sensing in nuclear environments. Active facilities at JPL will be used to measure the degradation of sensors, chips and software under a variety of radiation types and doses. JPL and Essex researchers will use these data to develop new models for predicting such degradation. Essex researchers will then develop new methods for software and embedded hardware design which overcome radiation damage by incorporating new approaches to fault detection, tolerance and recovery. The scenarios provided by the partners, and the degradation data measured by JPL, will be used to develop new benchmark datasets comprising data from multiple sensing modalities (RGB cameras, depth/range cameras, IR thermal imaging, underwater acoustic imaging), featuring a variety of nuclear scenes and objects. UoB and Essex researchers will develop new algorithms for real-time 3D characterisation of scenes, with intelligent and adaptive fusion of multiple sensing modalities. First, new multi-sensor fusion methods will be developed for 3D modelling, semantic/meta-data labelling, and the recognition and understanding of scenes and objects. Second, these methods will be extended to incorporate new algorithms for overcoming extreme noise and other kinds of degradation in images and sensor data. Third, we will develop the robots and robot control methods needed to: i) deploy remote sensors into extreme environments; and ii) exploit remote sensor data to guide robotic interventions and actions in these environments. Finally, we will carry out experimental deployments of these new technologies. Robust hardware and software solutions, developed by Essex, will be tested in active radiation environments at JPL. 
We will also carry out experimental robotic deployments of sensor payloads into inactive but plant-representative nuclear environments at NNL Workington and the Naraha Fukushima mock-up testing facilities in Japan.


