
British Telecommunications plc
124 Projects, page 1 of 25
Project 2012-2016
Partners: Cambridge Silicon Radio Ltd, British Telecom, Nokia Siemens Networks, University of Southampton, BT Group (United Kingdom), [no title available], British Telecommunications plc, Nokia Siemens Networks (UK)
Funder: UK Research and Innovation. Project Code: EP/J015520/1. Funder Contribution: 316,039 GBP

The Machine-To-Machine (M2M) applications of Wireless Sensor Networks (WSNs) and Wireless Body Area Networks (WBANs) are set to offer many new capabilities in the EPSRC themes of 'Healthcare technologies', 'Living with environmental change' and 'Global uncertainties', granting significant societal and economic benefits. These networks comprise a number of geographically-separated sensor nodes, which collect information from their environment and exchange it using wireless transmissions. However, these networks cannot as yet be employed in demanding applications, because current sensor nodes cannot remain powered for a sufficient length of time without employing batteries that are prohibitively large, heavy or expensive. In this work, we aim to achieve an order-of-magnitude extension of the battery lifetime of WSNs and WBANs by facilitating a significant reduction in the main cause of their energy consumption, namely the energy used to transmit information between the sensor nodes. A reduction in the sensor nodes' transmission energy is normally prevented, because it results in corrupted transmitted information owing to noise or interference. However, we will maintain reliable communication when using a low transmit energy by specifically designing channel code implementations that can be employed in the sensor nodes to correct these transmission errors.
Although existing channel code implementations can achieve this objective, they may themselves have a high energy consumption, which can erode the transmission energy reduction they afford. Therefore, in this work we will aim to achieve a beneficial step change in the energy consumption of channel code implementations, so that their advantages are maintained when they are employed in energy-constrained wireless communication systems, such as the M2M applications of WSNs and WBANs. We shall achieve this by facilitating a significant reduction in the supply voltage that is used to power the channel code implementations. A reduction in the supply voltage is normally prevented, because it reduces the speed of the implementation and causes the processed information to become corrupted when its operations can no longer be performed within the allotted time. However, we will maintain reliable operation when using a low supply voltage by specifically designing the proposed channel code implementations to use their inherent error correction ability to correct not only transmission errors, but also these timing errors. To the best of our knowledge, this novel approach has never been attempted before, despite its significant benefits. Furthermore, we will develop methodologies that allow the designers of WSNs and WBANs to estimate the energy consumption of the proposed channel code implementations without having to fabricate them. This will allow other researchers to promptly optimise the design of the proposed channel code implementations to suit their own energy-constrained wireless communication systems. Using this approach, we will demonstrate how the channel coding algorithm and its implementation can be designed holistically, in order to find the most desirable trade-off between complexity and performance.
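The error-correction principle this abstract relies on can be illustrated with the simplest classical channel code. The sketch below (not the project's own, more powerful codes) uses a Hamming(7,4) code to show how adding parity bits lets a receiver correct a single corrupted bit, whether the corruption comes from a noisy low-energy transmission or from a timing error inside a low-voltage implementation:

```python
import numpy as np

# Illustrative sketch only: a Hamming(7,4) code. The project targets more
# capable codes, but the mechanism -- redundancy enabling error correction --
# is the same.

# Systematic generator matrix G and matching parity-check matrix H over GF(2).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data4):
    """Map 4 data bits to a 7-bit codeword (3 parity bits of redundancy)."""
    return (np.array(data4) @ G) % 2

def decode(received7):
    """Correct up to one flipped bit via the syndrome, return the data bits."""
    r = np.array(received7).copy()
    syndrome = (H @ r) % 2
    if syndrome.any():
        # A single-bit error at position i yields a syndrome equal to
        # column i of H; flip that bit back.
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                r[i] ^= 1
                break
    return r[:4]

data = [1, 0, 1, 1]
codeword = encode(data)
codeword[2] ^= 1                     # simulate one transmission or timing error
assert list(decode(codeword)) == data  # the receiver still recovers the data
```

Stronger codes generalise this syndrome-decoding idea to much larger blocks; it is in those larger decoders that the implementation energy the proposal targets is actually spent.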
Project 2018-2024
Partners: Meta (Previously Facebook), Facebook, Amazon Web Services (UK), HP Research Laboratories, Methods Group, British Telecom, British Telecommunications plc, GridPP, BT Group (United Kingdom), UCL, Hewlett-Packard Ltd
Funder: UK Research and Innovation. Project Code: EP/R006865/1. Funder Contribution: 6,146,080 GBP

The smooth functioning of society is critically dependent not only on the correctness of programs, particularly of programs controlling critical and high-sensitivity core components of individual systems, but also upon correct and robust interaction between diverse information-processing ecosystems of large, complex, dynamic, highly distributed systems. Failures are common, unpredictable, highly disruptive, and span multiple organizations. The scale of systems' interdependence will increase by orders of magnitude in the next few years. Indeed, by 2020, with developments in Cloud, the Internet of Things, and Big Data, we may be faced with a world of 25 million apps, 31 billion connected devices, 1.3 trillion tags/sensors, and a data store of 50 trillion gigabytes (data: IDC, ICT Outlook: Recovering Into a New World, #DR2010_GS2_JG, March 2010). Robust interaction between systems will be critical to everyone and every aspect of society. Although the correctness and security of complete systems in this world cannot be verified, we can hope to be able to ensure that specific systems, such as verified safety-, security-, or identity-critical modules, are correctly interfaced. The recent success of program verification notwithstanding, there remains little prospect of verifying such ecosystems in their entireties: the scale and complexity are just too great, as are the social and managerial coordination challenges.
Even being able to define what it means to verify something that is going to have an undetermined role in a larger system presents a serious challenge. It is perhaps evident that the most critical aspect of the operation of these information-processing ecosystems lies in their interaction: even perfectly specified and implemented individual systems may be used in contexts for which they were not intended, leading to unreliable, insecure communications between them. We contend that interfaces supporting such interactions are therefore the critical mechanism for ensuring systems behave as intended. However, the verification and modelling techniques that have been so effective in ensuring the reliability of low-level features of programs, protocols, and policies (and so of the software that drives large systems) are, essentially, not applied to reasoning about such large-scale systems and their interfaces. We intend to address this deficiency by researching the technical, organizational, and social challenges of specifying and verifying interfaces in system ecosystems. In so doing, we will drive the use of verification techniques and improve the reliability of large systems. Complex system ecosystems and their interfaces are some of the most intricate and critical information ecosystems in existence today, and are highly dynamic and constantly evolving. We aim to understand how the interfaces between the components constituting these ecosystems work, and to verify them against their intended use. This research will be undertaken through a collection of different themes covering systems topics where the interface is crucially important, including critical code, communications and security protocols, distributed systems and networks, security policies, business ecosystems, and even extending to the physical architecture of buildings and networks.
These themes are representative of the problem of specifying and reasoning about the correctness of interfaces at different levels of abstraction and criticality. Interfaces of each degree of abstraction and criticality can be studied independently, but we believe that it will be possible to develop a quite general, uniform account of specifying and reasoning about them. It is unlikely that any one level of abstraction will suggest all of the answers: we expect that the work of the themes will evolve and interact in complex ways.
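As a toy illustration of what specifying and verifying an interface can mean at the smallest scale, the sketch below (invented for this summary, not taken from the project) attaches a machine-checkable contract of preconditions and postconditions to the boundary of a component, so that misuse by a neighbouring system is caught at the interface rather than propagating silently:

```python
# Illustrative sketch only: a runtime-checked interface contract.
# All names here are hypothetical, invented for illustration.
from functools import wraps

def contract(pre, post):
    """Wrap a boundary function so violations of its interface are detected."""
    def deco(fn):
        @wraps(fn)
        def checked(*args):
            assert pre(*args), f"precondition violated at interface of {fn.__name__}"
            result = fn(*args)
            assert post(result), f"postcondition violated at interface of {fn.__name__}"
            return result
        return checked
    return deco

# The component's interface promise: given a non-empty list of readings,
# it returns their mean as a float.
@contract(pre=lambda xs: len(xs) > 0,
          post=lambda m: isinstance(m, float))
def mean_reading(xs):
    return sum(xs) / len(xs)

print(mean_reading([2.0, 4.0]))   # prints 3.0
```

Static verification, as pursued by the project, aims to establish such interface properties before deployment rather than detecting violations at run time.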
Project 2022-2027
Partners: The Alan Turing Institute, Microsoft Research Ltd, Met Office, GlaxoSmithKline PLC, NHSx, Dassault Systemes Simulia Corp, Aviva Plc, University of Bath, GE Healthcare (International), British Telecommunications plc, GlaxoSmithKline (Harlow), Adaptix Ltd, BT Group (United Kingdom), GE Healthcare Systems France, Radius Diagnostics Ltd, British Telecom, GSK
Funder: UK Research and Innovation. Project Code: EP/V026259/1. Funder Contribution: 3,357,500 GBP

Machine learning (ML), in particular deep learning (DL), is one of the fastest-growing areas of modern science and technology, with a potentially enormous and transformative impact on all areas of our lives. The applications of DL embrace many disciplines, such as the (bio-)medical sciences, computer vision, the physical sciences, the social sciences, speech recognition, gaming, music and finance. DL-based algorithms are now used to play chess and Go at the highest level, diagnose illness, drive cars, recruit staff and even make legal judgements. The possible applications in the future are almost unlimited. Perhaps DL methods will be used in the future to predict the weather and climate, or even human behaviour. However, alongside this explosive growth has been a concern that there is a lack of explainability behind DL and the way that DL-based algorithms make their decisions. This leads to a lack of trustworthiness in the use of these algorithms. A reason for this is that the huge successes of deep learning are not well understood, the results are mysterious, and there is a lack of a clear link between the data used to train DL algorithms (which is often vague and unstructured) and the decisions made by these algorithms.
Part of the reason for this is that DL has advanced so fast that there is a lack of understanding of its foundations. According to the leading computer scientist Ali Rahimi at NIPS 2017: 'We say things like "machine learning is the new electricity". I'd like to offer another analogy. Machine learning has become alchemy!' Indeed, despite the roots of ML lying in mathematics, statistics and computer science, there is currently hardly any rigorous mathematical theory for the setup, training and application performance of deep neural networks. We urgently need to change machine learning from alchemy into science. This programme grant aims to rise to this challenge and, by doing so, to unlock the future potential of artificial intelligence. It aims to put deep learning onto a firm mathematical basis, and will combine theory, modelling, data and computation to unlock the next generation of deep learning. The grant will comprise an interlocked set of work packages addressing both the theoretical development of DL (so that it becomes explainable) and its algorithmic development (so that it becomes trustworthy). These will then be linked to the development of DL in a number of key application areas, including image processing, partial differential equations and environmental problems. For example, we will explore whether DL-based algorithms can forecast the weather and climate faster and more accurately than the existing physics-based algorithms. The investigators on the grant will undertake theoretical investigations and will work with end-users of DL in many application areas. Mindful that policy makers are trying to address the many issues raised by DL, the investigators will also reach out to them through a series of workshops and conferences. The results of the work will also be presented to the public at science festivals and other open events.
Project 2013-2016
Partners: University of Leeds, British Telecommunications plc, Alcatel-Lucent, Alcatel-Lucent (United States), British Telecom, BT Group (United Kingdom)
Funder: UK Research and Innovation. Project Code: EP/K016873/1. Funder Contribution: 356,078 GBP

Internet power consumption has continued to increase over the last decade as a result of bandwidth growth of at least 50 to 100 times, and further bandwidth growth of between 40% and 300% is predicted in the next 3 years as a result of the growing popularity of bandwidth-intensive applications. Energy efficiency is therefore increasingly becoming a key priority for ICT organizations, given the obvious ecological and economic drivers. In this project we adopt the GreenTouch energy-saving target of a factor of 100 for Core Switching and Routing, and believe this ambitious target is achievable should the research in this proposal prove successful. A key observation in core networks is that most of the power is consumed in the IP layer, while optical transmission and optical switching are power-efficient in comparison; hence the inspiration for this project. We will therefore introduce energy-efficient optimum physical network topologies that favour optical transmission and optical switching over IP routing whenever possible. Initial studies by the applicants show that physical topology choices in networks have the potential to significantly reduce power consumption; however, network optimization and the consideration of traffic, together with the opportunities afforded by large, low-power photonic switch architectures, will lead to further power savings.
We will investigate a large, high-speed photonic switch architecture in this project, minimize its power consumption, and determine optimum network physical topologies that exploit this switch to minimize power consumption. We will design new large photonic switch fabrics, based on hybrid semiconductor optical amplifier (SOA) / Mach-Zehnder interferometer gating elements, to minimise the switching energy per bit, and plan to optimize the network architecture making use of these new switch architectures and to introduce (photonic) chip power monitoring to inform higher-layer decisions. Networks are typically over-provisioned at present to maintain quality of service; we will study optimum resource allocation to reduce the over-provisioning factor while maintaining quality of service. Protection is currently provided in networks through the allocation of redundant paths and resources, and for full protection there is a protection route for every working route; we will optimize our networks to minimize the power wasted on protection. The power savings due to optimum physical topology design, optimum resource allocation, optical switching instead of IP routing, more efficient photonic switches and energy-efficient protection can be combined, and therefore the investigators and their industrial collaborators BT, Alcatel-Lucent and Telekomunikacja Polska believe that an ambitious factor-of-100 power saving in core networks can be realised through this project, with significant potential for impact on how core photonic networks are designed and implemented.
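The project's central observation (IP routing dominates core-network power, while optical transmission and switching are comparatively cheap) can be sketched with a toy per-demand power model. The per-bit energy values below are invented placeholders for illustration, not figures from the project:

```python
# Illustrative sketch only: a toy model of optical bypass vs. IP routing.
# The energy-per-bit figures are assumed placeholders, not measured values.

E_IP_ROUTER = 10.0e-9   # J per bit traversing an IP router (assumed)
E_OPTICAL   = 0.5e-9    # J per bit switched/carried optically (assumed)

def path_power(bit_rate, hops, optical_bypass_hops):
    """Power (W) for one traffic demand: each hop is either an IP-router
    traversal or an all-optical bypass of the IP layer."""
    ip_hops = hops - optical_bypass_hops
    return bit_rate * (ip_hops * E_IP_ROUTER + optical_bypass_hops * E_OPTICAL)

# A 100 Gbit/s demand over 5 hops: all-IP vs. 4 of 5 hops bypassed optically.
demand = 100e9
all_ip = path_power(demand, hops=5, optical_bypass_hops=0)
bypass = path_power(demand, hops=5, optical_bypass_hops=4)
print(f"all-IP: {all_ip:.0f} W, with optical bypass: {bypass:.0f} W")
```

With these assumed figures, bypassing four of the five hops optically cuts the demand's power by roughly a factor of four; the topology optimization proposed above seeks such bypass opportunities network-wide, which is why it can compound with switch-level and protection-level savings.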
Project 2022-2026
Partners: British Telecommunications plc, Thales UK Limited, British Telecom, SinoWave, BT Group (United Kingdom), Plextek Ltd, Thales Aerospace, Filtronic plc, NEC UK Ltd, Qioptiq Ltd, Durham University, QinetiQ, Filtronic Compound Semiconductors Ltd
Funder: UK Research and Innovation. Project Code: EP/W027151/1. Funder Contribution: 786,349 GBP

As mobile radio systems developed, their operating frequency increased to the millimetre-wave band (>30 GHz), first used in the fifth-generation mobile radio network (5G). Now, as we look beyond 5G, higher frequencies are being considered, with increased interest in 140-170 GHz (termed the D-band) and beyond (the 275 GHz band). At these frequencies, where there is plenty of available spectrum to satisfy the spectrum-hungry applications of wireless systems, new designs are required, with little work done in this area worldwide. This proposal brings together the complementary expertise of three world-leading UK research groups to research, design and experimentally demonstrate systems working at these frequencies in an integrative and holistic fashion. For such work, there are three key challenges relating to the radio channel and the signal and system design. Challenge 1: to design wireless communication systems, it is paramount to have a verifiable model of the physical propagation channel, built by collecting measurement data from specialist, bespoke equipment termed a "channel sounder", which sends signals over the air so that the receiver can measure them after propagation. Such a model depends on several physical factors, but mainly the transmission signal parameters, e.g.
the frequency of transmission and the bandwidth of the signal, and the propagation channel's physical parameters, such as the channel size and environment, whether it is indoors or outdoors, the presence of obstacles, moisture, pollution and other environmental factors. Professor Salous and her group at Durham have been building channel sounders for over thirty years, and the models she has developed are considered amongst the best in the world, used by regulators, industry and the United Nations through the International Telecommunication Union (ITU). Professor Salous proposes to design and test new channel sounders in the D-band and at the higher 275 GHz band. These will be unique sounders, and the aim is to develop unique models and set the standards for future-generation wireless systems. The models will be verified in a practical setting through collaboration with the teams at QMUL and UCL. Challenge 2: the transmission of information at high frequencies requires specialist circuit and equipment design. Whilst there are several circuits for such signals, there are few antennas that can transmit and receive the signals and process them spatially. Professor Yang Hao at QMUL, who has been designing antennas for high frequencies for nearly three decades, will design specialist antennas, to be manufactured using simple 3D-printing processes, to integrate with the system designed at Durham for full channel measurements. The designs will be optimized in consultation between the teams, taking the channel models into account. The outcome is a system with multiple antennas that can focus the transmission beams and change their shape and direction (a process termed beamforming), so that a system can be constructed that will fully utilize the benefits of the high frequencies and link to the signals addressed by the UCL team.
Challenge 3: for the past 20 years, the UCL team, led by Professor Darwazeh, has designed and demonstrated specialist signals for mobile and wireless systems that maximise the amount of information carried while minimising the energy required for good signal transmission; these properties are termed spectral and energy efficiency. UCL will design spectrally and energy-efficient signals, based on the D-band channel models derived at Durham and suitable for transmission using the antennas designed by QMUL; the outcome will be a complete transmission system at D-band with projected bit rates beyond 50 Gbit/s, nearly an order of magnitude beyond what can be achieved using 5G systems. The three teams bring strong industrial support to achieve what is predicted to be a world first, one that is attracting interest from all sectors.
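The >50 Gbit/s target is plausible on Shannon-capacity grounds alone, given the wide spectrum available in the D-band. The sketch below uses assumed bandwidth and SNR values (illustrative only, not the project's link budget) to show the scale:

```python
# Illustrative sketch only: Shannon capacity of an AWGN channel,
# C = B * log2(1 + SNR). Bandwidth and SNR below are assumed values.
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Upper bound on error-free bit rate for an AWGN channel (bit/s)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

bandwidth = 10e9              # 10 GHz of D-band spectrum (assumed)
snr_db = 15                   # assumed received SNR
snr = 10 ** (snr_db / 10)
capacity = shannon_capacity(bandwidth, snr)
print(f"capacity: {capacity / 1e9:.1f} Gbit/s")  # ~50 Gbit/s with these assumptions
```

Even a modest slice of the 30 GHz available in the D-band therefore supports the target rate at a realistic SNR; the spectrally efficient signal designs described above aim to approach this bound in practice.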