
British Telecom

149 Projects, page 1 of 30
  • Funder: UK Research and Innovation Project Code: EP/J015520/1
    Funder Contribution: 316,039 GBP

    The Machine-To-Machine (M2M) applications of Wireless Sensor Networks (WSNs) and Wireless Body Area Networks (WBANs) are set to offer many new capabilities in the EPSRC themes of 'Healthcare technologies', 'Living with environmental change' and 'Global uncertainties', delivering significant societal and economic benefits. These networks comprise a number of geographically separated sensor nodes, which collect information from their environment and exchange it using wireless transmissions. However, these networks cannot yet be employed in demanding applications, because current sensor nodes cannot remain powered for a sufficient length of time without employing batteries that are prohibitively large, heavy or expensive. In this work, we aim to extend by an order of magnitude the time for which WSNs and WBANs can operate on a single battery charge, by facilitating a significant reduction in the main cause of their energy consumption, namely the energy used to transmit information between the sensor nodes. A reduction in the sensor nodes' transmission energy is normally prevented, because it results in corrupted transmitted information owing to noise or interference. However, we will maintain reliable communication at low transmit energy by specifically designing channel code implementations that can be employed in the sensor nodes to correct these transmission errors. Although existing channel code implementations can achieve this objective, they themselves may have a high energy consumption, which can erode the transmission energy reduction they afford. Therefore, in this work we aim to achieve a step change in the energy consumption of channel code implementations, so that their advantages are maintained when they are employed in energy-constrained wireless communication systems such as the M2M applications of WSNs and WBANs. We shall achieve this by facilitating a significant reduction in the supply voltage that is used to power the channel code implementations. A reduction in the supply voltage is normally prevented, because it slows the implementation and causes the processed information to become corrupted when its operations can no longer be performed within the allotted time. However, we will maintain reliable operation at low supply voltage by specifically designing the proposed channel code implementations to use their inherent error correction ability to correct not only transmission errors but also these timing errors. To the best of our knowledge, this approach has not been attempted before, despite its significant benefits. Furthermore, we will develop methodologies that allow the designers of WSNs and WBANs to estimate the energy consumption of the proposed channel code implementations without having to fabricate them. This will allow other researchers to promptly optimise the design of the proposed channel code implementations to suit their own energy-constrained wireless communication systems. Using this approach, we will demonstrate how the channel coding algorithm and its implementation can be designed holistically, in order to find the most desirable trade-off between complexity and performance.
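
    The channel codes this project targets are far more sophisticated, but a minimal sketch of the underlying principle may help. The classic Hamming(7,4) code below (a textbook construction, not the project's design; the helper names are illustrative) adds three parity bits to four data bits so that any single corrupted bit, whether flipped by channel noise at reduced transmit energy or by a timing fault at reduced supply voltage, can be located and repaired at the receiver.

      # Hamming(7,4): encode 4 data bits into 7 bits so that any single flipped
      # bit can be located and corrected at the receiver.

      def hamming74_encode(data_bits):
          """data_bits: 4 bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
          d1, d2, d3, d4 = data_bits
          p1 = d1 ^ d2 ^ d4   # parity over codeword positions 1, 3, 5, 7
          p2 = d1 ^ d3 ^ d4   # parity over codeword positions 2, 3, 6, 7
          p3 = d2 ^ d3 ^ d4   # parity over codeword positions 4, 5, 6, 7
          return [p1, p2, d1, p3, d2, d3, d4]

      def hamming74_decode(received):
          """received: 7 bits, possibly with one error -> corrected 4 data bits."""
          c = list(received)
          s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
          s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
          s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
          error_position = s1 + 2 * s2 + 4 * s3   # 0 means no error, else 1-indexed
          if error_position:
              c[error_position - 1] ^= 1          # flip the corrupted bit back
          return [c[2], c[4], c[5], c[6]]

      if __name__ == "__main__":
          data = [1, 0, 1, 1]
          sent = hamming74_encode(data)
          received = sent.copy()
          received[4] ^= 1                        # simulate one bit corrupted in transit
          assert hamming74_decode(received) == data

    The same redundancy that lets the receiver tolerate a noisier, lower-energy transmission is what the proposal intends to reuse to tolerate occasional timing errors inside a voltage-scaled decoder.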

  • Funder: UK Research and Innovation Project Code: EP/R006865/1
    Funder Contribution: 6,146,080 GBP

    The smooth functioning of society is critically dependent not only on the correctness of programs, particularly of programs controlling critical and high-sensitivity core components of individual systems, but also upon correct and robust interaction between diverse information-processing ecosystems of large, complex, dynamic, highly distributed systems. Failures are common, unpredictable, highly disruptive, and span multiple organizations. The scale of systems' interdependence will increase by orders of magnitude in the next few years. Indeed, by 2020, with developments in Cloud, the Internet of Things, and Big Data, we may be faced with a world of 25 million apps, 31 billion connected devices, 1.3 trillion tags/sensors, and a data store of 50 trillion gigabytes (data: IDC, ICT Outlook: Recovering Into a New World, #DR2010_GS2_JG, March 2010). Robust interaction between systems will be critical to everyone and every aspect of society. Although the correctness and security of complete systems in this world cannot be verified, we can hope to ensure that specific systems, such as verified safety-, security-, or identity-critical modules, are correctly interfaced. The recent success of program verification notwithstanding, there remains little prospect of verifying such ecosystems in their entirety: the scale and complexity are just too great, as are the social and managerial coordination challenges. Even defining what it means to verify something that will play an undetermined role in a larger system presents a serious challenge. It is perhaps evident that the most critical aspect of the operation of these information-processing ecosystems lies in their interaction: even perfectly specified and implemented individual systems may be used in contexts for which they were not intended, leading to unreliable, insecure communications between them. We contend that the interfaces supporting such interactions are therefore the critical mechanism for ensuring systems behave as intended. However, the verification and modelling techniques that have been so effective in ensuring the reliability of low-level features of programs, protocols, and policies (and so of the software that drives large systems) are essentially not applied to reasoning about such large-scale systems and their interfaces. We intend to address this deficiency by researching the technical, organizational, and social challenges of specifying and verifying interfaces in system ecosystems. In so doing, we will drive the use of verification techniques and improve the reliability of large systems. Complex system ecosystems and their interfaces are some of the most intricate and critical information ecosystems in existence today, and are highly dynamic and constantly evolving. We aim to understand how the interfaces between the components constituting these ecosystems work, and to verify them against their intended use. This research will be undertaken through a collection of themes covering systems topics where interfaces are crucially important, including critical code, communications and security protocols, distributed systems and networks, security policies, business ecosystems, and even extending to the physical architecture of buildings and networks. These themes are representative of the problem of specifying and reasoning about the correctness of interfaces at different levels of abstraction and criticality. Interfaces at each level of abstraction and criticality can be studied independently, but we believe that it will be possible to develop a quite general, uniform account of specifying and reasoning about them. It is unlikely that any one level of abstraction will suggest all of the answers: we expect that the work of the themes will evolve and interact in complex ways.
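
    As a loose illustration of what an explicit, machine-checkable interface can look like at the code level (a minimal runtime-contract sketch in Python, not the project's verification methodology; the withdraw example and contract helper are hypothetical), the snippet below attaches pre- and postconditions to a function so that a component used outside its intended context fails at the boundary rather than silently downstream.

      from functools import wraps

      def contract(pre=None, post=None):
          """Attach a precondition on the arguments and a postcondition on the result."""
          def decorate(func):
              @wraps(func)
              def wrapper(*args, **kwargs):
                  if pre is not None:
                      assert pre(*args, **kwargs), f"precondition of {func.__name__} violated"
                  result = func(*args, **kwargs)
                  if post is not None:
                      assert post(result), f"postcondition of {func.__name__} violated"
                  return result
              return wrapper
          return decorate

      # Hypothetical interface between two components: the caller must request a
      # positive amount it can afford; the callee promises a non-negative balance.
      @contract(pre=lambda balance, amount: amount > 0 and amount <= balance,
                post=lambda new_balance: new_balance >= 0)
      def withdraw(balance, amount):
          return balance - amount

      if __name__ == "__main__":
          print(withdraw(100, 30))   # respects the contract -> 70
          # withdraw(100, 250)       # would fail at the interface boundary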

  • Funder: UK Research and Innovation Project Code: EP/D076870/1
    Funder Contribution: 790,234 GBP

    One of the greatest challenges facing civil engineers in the 21st century is the stewardship of ageing infrastructure. Nowhere is this more apparent than in the networks of tunnels, pipelines and bridges that lie beneath and above the major cities of the world. Much of this infrastructure was constructed more than half a century ago and there is widespread evidence of its deterioration. Tunnels, particularly old ones, are prone to being affected by nearby activities such as piling, deep excavations and the construction of other tunnels. Excessive leakage and pipe bursts are frequent and usually unanticipated. Importantly, underground structures often cannot be inspected while they are in use by trains, or because of other physical constraints. The fragility of old infrastructure also presents a challenge for new construction in congested urban environments. Little is known of the long-term performance of such infrastructure. These uncertainties, and the importance of safety to users and consumers, prompted the initiation of recent research projects investigating the prospect of damage detection and decision making and the use of novel materials to mitigate damage. Advances in the development of innovative sensors, such as fibre-optic sensors and micro-electro-mechanical systems (MEMS) sensors, offer intriguing possibilities that can radically alter the paradigms underlying existing methods of condition assessment and monitoring. Future monitoring systems will undoubtedly comprise Wireless Sensor Networks (WSNs) and will be designed around the capabilities of autonomous nodes. Each node in the network will integrate specific sensing capabilities with communication, data processing and power supply. It is therefore the objective of this proposal to demonstrate how large numbers of sensors can be integrated into large-scale engineering systems to improve performance and extend the lifetime of infrastructure, while continuously evaluating and managing uncertainties and risks. This proposal is a joint project between the University of Cambridge and Imperial College London and comprises an integrated research programme to evaluate and develop prototype WSN systems. The main objectives are to bridge advances in modelling large-scale engineering infrastructure with advances in wireless sensor networks, and to develop a low-cost smart sensing environment for monitoring ageing public infrastructure. Three application domains will be studied in detail: (i) water supply and sewer systems, (ii) tunnels, and (iii) bridges. The complexity of the monitoring system requires the following research areas to be explored: sensor systems, wireless communications, autonomous systems, information management, programming and design tools, trust, security and privacy, systems theory, human factors and social issues. Field trials will be carried out with London Underground Ltd., Thames Water, the Highways Agency and the Humber Bridge. Intel Corporation will support the project with hardware for the trials.
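
    As a rough illustration of the node-level trade-off behind such autonomous nodes (sensor names, units, thresholds and the change-detection rule are all invented for this sketch, not taken from the project), the snippet below shows a node that samples locally but spends radio energy only when a reading has changed materially since the last report.

      import random

      def read_sensor():
          """Stand-in for a real fibre-optic or MEMS measurement (e.g. crack width in mm)."""
          return 2.0 + random.gauss(0, 0.05)

      def transmit(value):
          """Stand-in for the node's radio; transmission dominates the energy budget."""
          print(f"reporting reading: {value:.3f} mm")

      def monitor(num_samples=100, threshold=0.1):
          """Sample locally, but only radio a report when the value drifts materially."""
          last_reported = None
          for _ in range(num_samples):
              value = read_sensor()
              if last_reported is None or abs(value - last_reported) > threshold:
                  transmit(value)
                  last_reported = value

      if __name__ == "__main__":
          monitor()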

  • Funder: UK Research and Innovation Project Code: EP/V026259/1
    Funder Contribution: 3,357,500 GBP

    Machine learning (ML), and in particular deep learning (DL), is one of the fastest-growing areas of modern science and technology, with a potentially enormous and transformative impact on all areas of our lives. The applications of DL span many disciplines, such as (bio-)medical sciences, computer vision, the physical sciences, the social sciences, speech recognition, gaming, music and finance. DL-based algorithms are now used to play chess and Go at the highest level, diagnose illness, drive cars, recruit staff and even make legal judgements. The possible future applications are almost unlimited; perhaps DL methods will one day be used to predict the weather and climate, or even human behaviour. However, alongside this explosive growth has been a concern that there is a lack of explainability behind DL and the way that DL-based algorithms make their decisions, which leads to a lack of trust in these algorithms. One reason for this is that the huge successes of deep learning are not well understood: the results are mysterious, and there is no clear link between the data used to train DL algorithms (which are often vague and unstructured) and the decisions made by these algorithms. Part of the reason is that DL has advanced so fast that there is a lack of understanding of its foundations. According to the leading computer scientist Ali Rahimi at NIPS 2017: 'We say things like "machine learning is the new electricity". I'd like to offer another analogy. Machine learning has become alchemy!' Indeed, despite the roots of ML lying in mathematics, statistics and computer science, there is currently hardly any rigorous mathematical theory for the setup, training and application performance of deep neural networks. We urgently need to change machine learning from alchemy into science. This programme grant aims to rise to this challenge and, by doing so, to unlock the future potential of artificial intelligence. It aims to put deep learning onto a firm mathematical basis, and will combine theory, modelling, data and computation to unlock the next generation of deep learning. The grant will comprise an interlocked set of work packages that address both the theoretical development of DL (so that it becomes explainable) and its algorithmic development (so that it becomes trustworthy). These will then be linked to the development of DL in a number of key application areas, including image processing, partial differential equations and environmental problems. For example, we will explore whether DL-based algorithms can forecast the weather and climate faster and more accurately than existing physics-based algorithms. The investigators on the grant will carry out theoretical investigations and will also work with end-users of DL in many application areas. Mindful that policy makers are trying to address the many issues raised by DL, the investigators will also reach out to them through a series of workshops and conferences. The results of the work will also be presented to the public at science festivals and other open events.
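
    As a toy illustration of the 'learned surrogate' idea mentioned above (NumPy only; a 1-D target function stands in for an expensive physics-based model, and nothing here reflects the grant's actual methods), the sketch below trains a one-hidden-layer network by plain gradient descent so that later evaluations of the fitted model are cheap.

      import numpy as np

      rng = np.random.default_rng(0)

      # Pretend sin(x) is an expensive physics-based model we want to emulate.
      x = rng.uniform(-3, 3, size=(256, 1))
      y = np.sin(x)

      # One-hidden-layer network: y_hat = tanh(x W1 + b1) W2 + b2
      W1 = rng.normal(0, 1.0, (1, 32)); b1 = np.zeros(32)
      W2 = rng.normal(0, 0.1, (32, 1)); b2 = np.zeros(1)
      learning_rate = 0.1

      for step in range(3000):
          hidden = np.tanh(x @ W1 + b1)           # (256, 32)
          y_hat = hidden @ W2 + b2                # (256, 1)
          error = y_hat - y
          loss = np.mean(error ** 2)

          # Backpropagate the mean-squared error by hand.
          g_out = 2 * error / len(x)              # (256, 1)
          g_W2 = hidden.T @ g_out;  g_b2 = g_out.sum(axis=0)
          g_hidden = (g_out @ W2.T) * (1 - hidden ** 2)
          g_W1 = x.T @ g_hidden;    g_b1 = g_hidden.sum(axis=0)

          W1 -= learning_rate * g_W1; b1 -= learning_rate * g_b1
          W2 -= learning_rate * g_W2; b2 -= learning_rate * g_b2

      print(f"surrogate training loss after {step + 1} steps: {loss:.4f}")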

  • Funder: UK Research and Innovation Project Code: EP/K016873/1
    Funder Contribution: 356,078 GBP

    Internet power consumption has continued to increase over the last decade as a result of bandwidth growth of at least 50 to 100 times, and further bandwidth growth of between 40% and 300% is predicted over the next 3 years as a result of the growing popularity of bandwidth-intensive applications. Energy efficiency is therefore increasingly becoming a key priority for ICT organisations, given the obvious ecological and economic drivers. In this project we adopt the GreenTouch energy-saving target of a factor of 100 for Core Switching and Routing, and believe this ambitious target is achievable should the research in this proposal prove successful. A key observation in core networks is that most of the power is consumed in the IP layer, while optical transmission and optical switching are power-efficient in comparison; hence the inspiration for this project. We will therefore introduce energy-efficient optimum physical network topologies that favour optical transmission and optical switching over IP routing whenever possible. Initial studies by the applicants show that physical topology choices have the potential to significantly reduce network power consumption; however, network optimisation, together with consideration of traffic and the opportunities afforded by large, low-power photonic switch architectures, will lead to further power savings. We will investigate a large, high-speed photonic switch architecture in this project, minimise its power consumption, and determine optimum network physical topologies that exploit this switch to minimise power consumption. We will design new large photonic switch fabrics based on hybrid semiconductor optical amplifier (SOA) / Mach-Zehnder interferometer gating elements to minimise the switching energy per bit, and we plan to optimise the network architecture around these new switch architectures and to introduce (photonic) chip power monitoring to inform higher-layer decisions. Networks are typically over-provisioned at present to maintain quality of service; we will study optimum resource allocation to reduce the over-provisioning factor while maintaining quality of service. Protection is currently provided in networks through the allocation of redundant paths and resources, and for full protection there is a protection route for every working route; we will optimise our networks to minimise the power wasted on protection. The power savings due to optimum physical topology design, optimum resource allocation, optical switching in place of IP routing, more efficient photonic switches and energy-efficient protection can be combined, and therefore the investigators and their industrial collaborators BT, Alcatel-Lucent and Telekomunikacja Polska believe that an ambitious factor-of-100 power saving in core networks can be realised through this project, with significant potential impact on how core photonic networks are designed and implemented.
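
    A toy sketch of the topology intuition (node names and per-hop energy figures are invented, not measurements from the project): if traversing a node optically is assumed to cost far less energy than processing traffic in its IP router, a shortest-path search over energy weights naturally prefers optically switched routes, which is the effect the optimised topologies aim to exploit.

      import heapq

      # Assumed energy per bit for each hop (arbitrary units): hops that stay in
      # the optical domain are cheap, hops requiring IP routing are expensive.
      TOPOLOGY = {
          "A": {"B": 1, "C": 10},
          "B": {"A": 1, "C": 1, "D": 10},
          "C": {"A": 10, "B": 1, "D": 1},
          "D": {"B": 10, "C": 1},
      }

      def min_energy_path(graph, src, dst):
          """Dijkstra's algorithm over energy weights; returns (total_energy, path)."""
          queue = [(0, src, [src])]
          visited = set()
          while queue:
              energy, node, path = heapq.heappop(queue)
              if node == dst:
                  return energy, path
              if node in visited:
                  continue
              visited.add(node)
              for neighbour, cost in graph[node].items():
                  if neighbour not in visited:
                      heapq.heappush(queue, (energy + cost, neighbour, path + [neighbour]))
          return float("inf"), []

      if __name__ == "__main__":
          # The IP-routed routes A-B-D and A-C-D each cost 11; the all-optical
          # detour A-B-C-D costs only 3.
          print(min_energy_path(TOPOLOGY, "A", "D"))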
