
CENTRE D'INNOVATION EN TELECOMMUNICATIONS ET INTEGRATION DE SERVICES
13 Projects, page 1 of 3
Project (from 2022)
Partners: Institut National des Sciences Appliquées de Lyon - Laboratoire d'Ingénierie des Matériaux Polymères, ÉTS, IMDEA, IMDEA Software, UPT, CENTRE D'INNOVATION EN TELECOMMUNICATIONS ET INTEGRATION DE SERVICES
Funder: French National Research Agency (ANR)
Project Code: ANR-21-CHRA-0004
Funder Contribution: 183,590 EUR

The energy consumption of mobile networks has been the source of animated debates recently, with the deployment of 5G technologies. However, the energy consumption estimates put forward by the different parties in the debate differ significantly, by up to two orders of magnitude. This is the result of a lack of accurate models and meaningful metrics in this field. More precisely, the control plane of a mobile network represents a significant share of the traffic exchanged between the user and the network infrastructure, much more than in any other network technology, and this role will become even more important with the development of network function virtualisation and orchestration. Models focusing on application-level traffic and presenting energy consumption as Joules/bit are bound to make harsh approximations and assumptions, leading to results that cannot really help the involved parties, be they industrial stakeholders, policy makers or the general public. The ECOMOME project addresses this problem of accurately modelling and optimising the energy consumption of a mobile network, with a focus on 4G and 5G technologies. This will be achieved through three main research axes. The first contribution will be the first independent measurement study of energy consumption in a mobile network. We will address both user equipment and the radio access network, conducting a network metrology study on real operational networks and on experimental testbeds. The measurement data collected in this campaign will be the input for the other contributions of the project, but it will also be made openly available to the research community. The second objective of the project is to use this measurement data to design accurate energy consumption models for mobile networks. Here, we take an approach that is original with respect to the literature, focusing on modelling the impact of the building blocks of the mobile network: a series of "atomic" network mechanisms and functions which, in practice, compose any service scenario and any user context. Modelling these atomic network mechanisms requires detailed knowledge of the way a mobile network functions, but then allows the accurate modelling of any general scenario. Finally, the project also targets the proposal of energy-efficient networking solutions. Indeed, the measurement data and the energy consumption models will allow us to detect the most energy-hungry phases in a mobile network. To reduce their impact, we will propose network intelligence solutions based on observing the traffic transported by the network, detecting whenever the network settings are over-consuming, and adapting the network configuration with energy efficiency metrics in mind.
To achieve these objectives, the ECOMOME project brings together four partners with significant expertise in different topics related to mobile networks: cellular network architectures (ETS Montreal), network metrology (INSA Lyon), energy consumption (UP Timisoara) and network intelligence (IMDEA Networks Madrid). The results of the project will have a triple utility: 1) they will provide a new modelling approach and new network intelligence solutions to the academic and industrial communities working on mobile networks; 2) they will help policy makers in their decisions regarding the future evolution and deployment of mobile network technologies; and 3) they will allow the general public to easily and intuitively assess the energy consumption of their mobile equipment and of the network infrastructure in a variety of scenarios.
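The per-mechanism modelling idea can be pictured with a small sketch. The snippet below composes the energy of a hypothetical service scenario from a handful of atomic mechanisms (connection setup, scheduling grants, data transfer, release) and contrasts it with a naive Joules/bit estimate that ignores control-plane signalling. All mechanism names and energy values are illustrative assumptions, not measurements or models from the project.

```python
# Minimal sketch (hypothetical values) of the "atomic mechanism" modelling idea:
# instead of a single Joules/bit figure, a scenario's energy is composed from
# the network mechanisms it triggers (attach, paging, scheduling, data transfer...).

# Hypothetical per-invocation energy costs, in joules (illustrative only).
ATOMIC_COSTS_J = {
    "rrc_connection_setup": 0.50,   # control-plane signalling to enter connected mode
    "paging": 0.05,                 # network-initiated wake-up
    "uplink_grant": 0.002,          # per scheduling grant
    "data_transfer_per_mb": 0.80,   # user-plane transmission cost per megabyte
    "rrc_release": 0.10,            # return to idle
}

def scenario_energy(mechanism_counts, megabytes):
    """Compose a scenario's energy from the atomic mechanisms it invokes."""
    energy = sum(ATOMIC_COSTS_J[m] * n for m, n in mechanism_counts.items())
    return energy + ATOMIC_COSTS_J["data_transfer_per_mb"] * megabytes

# Example: a short video-chunk download preceded by connection setup.
counts = {"rrc_connection_setup": 1, "uplink_grant": 40, "rrc_release": 1}
print(f"scenario energy: {scenario_energy(counts, megabytes=5):.2f} J")

# Naive Joules/bit-style model for comparison: it ignores control-plane
# signalling, so small transfers (where setup dominates) are under-estimated.
JOULES_PER_MB = 0.80
print(f"naive estimate: {JOULES_PER_MB * 5:.2f} J")
```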
Project (from 2020)
Partners: Institut National des Sciences Appliquées de Lyon - Laboratoire d'Ingénierie des Matériaux Polymères, CENTRE D'INNOVATION EN TELECOMMUNICATIONS ET INTEGRATION DE SERVICES
Funder: French National Research Agency (ANR)
Project Code: ANR-19-CE23-0018
Funder Contribution: 254,297 EUR

The holy grail of Artificial Intelligence (AI), creating an agent (e.g., software or machine) that comes close to mimicking and possibly exceeding human intelligence, remains far off. But recent years have seen breakthroughs in agents that can gain abilities from experience with their environment, providing significant advances in society and industry, including health care, autonomous driving and recommender systems, and ultimately influencing many if not all aspects of everyday life. These advances are partly due to single-agent Deep Learning (DL) combined with Reinforcement Learning (RL) and Monte-Carlo Tree Search (MCTS), i.e., AI research subfields in which the agent can describe its world as a Markov decision process. Some stand-alone planning and RL algorithms are guaranteed to converge to the optimal behavior, as long as the environment the agent is experiencing is Markovian and stationary, but scalability remains a significant issue. DL combined with RL and MCTS has emerged as a powerful combination to break the curse of dimensionality in very large-scale domains, at the expense of astronomical data and computational resources, but so far its applicability is mainly restricted to either single-agent domains or sequential games. Today, real-life applications widely use multi-agent systems (MASs), that is, groups of autonomous, interacting agents sharing a common environment, which they perceive through sensors and upon which they act with actuators. At home, in cities, and almost everywhere, a growing number of sensing and acting machines surround us, sometimes visibly (e.g., robots, drones, cars, power generators) but often imperceptibly (e.g., smartphones, televisions, vacuum cleaners, washing machines). Before long, through the emergence of a new generation of communication networks, most of these machines will be interacting with one another through the Internet of Things (IoT). Constantly evolving MASs will thus break new ground in the coming years, pervading all areas of society and industry, including security, medicine, transport, and manufacturing. Although Markov decision processes provide a solid mathematical framework for single-agent planning and RL, they do not offer the same theoretical grounding in MASs. In contrast to single-agent systems, when multiple agents interact with one another, how the environment evolves depends not only on the action of one agent but also on the actions taken by the other agents, rendering the Markov property invalid and the environment no longer stationary. Also, a centralized (single-agent) control authority is often inadequate because agents cannot (e.g., due to communication cost, latency or noise) or do not want (e.g., in competitive or strategic settings) to share all their information all the time. As a consequence, the increasing penetration of MASs in society will require a paradigm shift, from single-agent to multi-agent planning and reinforcement learning algorithms, leveraging recent breakthroughs.
That leads us to the fundamental challenge this proposal addresses: the design of generic algorithms, with provable guarantees, that can efficiently compute rational strategies for a group of cooperating or competing agents in spite of stochasticity and sensing uncertainty, while using the same algorithmic scheme. Such algorithms should adapt to changes in the environment, apply to different tasks, and eventually converge to a rational solution for the task at hand. But they need not exhibit the fastest convergence rates, since there is no free lunch. Using the same algorithmic scheme for different problems eases knowledge transfer and dissemination in both expert and practitioner communities.
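As a toy illustration of the non-stationarity issue described above, the sketch below runs two independent Q-learners in a repeated two-action coordination game: each agent updates its values as if it were alone, so the reward process each one observes drifts as the other agent learns. The game, payoffs and hyperparameters are illustrative assumptions, not part of the project's algorithms.

```python
import random

# Two independent Q-learners in a repeated 2x2 coordination game. Each agent
# treats the other as part of its environment, so from either agent's viewpoint
# the rewards drift as the other learns -- the non-stationarity discussed above.

PAYOFF = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): 0.0, (1, 0): 0.0}  # coordinate to earn 1
ALPHA, EPSILON = 0.1, 0.1

q = [[0.0, 0.0], [0.0, 0.0]]  # q[agent][action], single (stateless) state

def pick(agent):
    """Epsilon-greedy action selection for one agent."""
    if random.random() < EPSILON:
        return random.randrange(2)
    return max(range(2), key=lambda a: q[agent][a])

for step in range(5000):
    a0, a1 = pick(0), pick(1)
    r = PAYOFF[(a0, a1)]                # shared reward
    q[0][a0] += ALPHA * (r - q[0][a0])  # each agent updates as if it were alone
    q[1][a1] += ALPHA * (r - q[1][a1])

print("agent 0 Q-values:", [round(v, 2) for v in q[0]])
print("agent 1 Q-values:", [round(v, 2) for v in q[1]])
```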
Project (from 2021)
Partners: Institut National des Sciences Appliquées de Lyon - Laboratoire d'Ingénierie des Matériaux Polymères, Association française pour le nommage Internet en coopération (Afnic), CENTRE D'INNOVATION EN TELECOMMUNICATIONS ET INTEGRATION DE SERVICES
Funder: French National Research Agency (ANR)
Project Code: ANR-20-CYAL-0002
Funder Contribution: 327,747 EUR

The emerging Internet of Things (IoT) is expected to host billions of devices that regularly report sensor readings over long- or short-range radio channels. The generated content items, as well as collateral metadata, are not well protected today because (i) channel encryption is commonly intercepted at gateways, (ii) identifiers reveal communication partners and contexts, and (iii) the cryptographic protection embedded in the low-end transmission infrastructure remains too weak to resist attacks. In the current IoT, sensing sources lose control over their data, often even before it reaches its destinations. In PIVOT, we start from two fundamental observations. First, a privacy-friendly IoT requires protecting content objects themselves, in addition to commonly deployed channel encryption. Content disclosure can thus be restricted to designated receivers. Second, names can serve as the principal interface to access IoT data, eliding source identities. Hence, individual endpoint identifiers will disappear from public Internet metadata. The innovation in PIVOT is to address the prevalent privacy and security issues of the IoT by proposing content object security principles built on privacy-friendly names, while remaining globally and seamlessly interoperable between IoT devices regardless of the networks to which they connect. PIVOT will focus on four core goals:
1. A crypto framework for privacy-friendly service primitives on ultra-constrained IoT devices.
2. Minimal trust anchor provisioning on IoT devices to enable object security.
3. Protocols that integrate decentralized object security.
4. Multi-stakeholder name management that preserves privacy requirements and generates, allocates and resolves names globally, regardless of the IoT applications or networks.
A demonstrator based on RIOT will verify our solutions in ultra-constrained IoT networks such as LoRaWAN. PIVOT follows the perspective that "immediate action is required". This implies that we will extend existing architectures and protocol standards in conjunction with standardization bodies, while introducing new primitives where necessary. Our ambitious roadmap will allow for incremental deployment of the PIVOT solutions, which is the most promising path to quick adoption. The consortium reflects the entire innovation chain and is centred on three strong, established German-French collaborations. First, the popular RIOT IoT operating system (OS) was co-founded and is jointly managed by Freie Universität Berlin, INRIA, and HAW Hamburg. Over the past seven years, this team has successfully established a global community for an open IoT. As of today, RIOT has a 5% global IoT OS market share. Second, a strong binational LoRa development community has been established; Afnic, Lobaro, HAW Hamburg, and FU Berlin are part of it. Particular contributions to IoT privacy come from INSA/INRIA in this context.
Third, international Internet standardisation is a focus of FU Berlin, HAW Hamburg, and Afnic, which have jointly worked in the IETF for more than 10 years. In addition to common technical interests, all partners of the German-French team are united in their dedication to an open IoT with full freedom of data and rights of privacy. Weak IoT security and privacy are rooted in economic factors, caused by the tension between the cost of security measures and the gains from data capitalization. The integration of the four PIVOT goals, based on existing protocols, open software, and open standards, will ease these economic tensions, reduce cost, promote business compliant with European privacy standards, and, in the long term, return data sovereignty to the public. With its open, sustainable perspective, PIVOT will contribute to countering the trend of Internet consolidation by allowing faster adoption of security and privacy solutions in the IoT, thereby fostering an open, trustworthy digitization of our societies.
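The content-object-security principle described above can be pictured with a short sketch: a sensor reading is encrypted as a self-contained object for its designated receiver and published under an opaque name, so intermediate gateways never see plaintext or source identities. The Fernet cipher, the key-provisioning step and the hash-derived name below are illustrative stand-ins, not PIVOT's actual primitives or naming scheme.

```python
import hashlib
from cryptography.fernet import Fernet  # any authenticated cipher would do; Fernet keeps the sketch short

# Sketch of content-object security: the reading is encrypted as a standalone
# object for its designated receiver and published under an opaque name, so
# gateways forward it without seeing the plaintext or the source identity.

receiver_key = Fernet.generate_key()   # trust anchor provisioned out of band (illustrative)
reading = b'{"sensor": "pm25", "value": 12.4}'

# 1. Protect the content object itself, independently of any channel encryption.
protected_object = Fernet(receiver_key).encrypt(reading)

# 2. Derive an opaque, privacy-friendly name: it identifies the object, not the source.
name = hashlib.sha256(protected_object).hexdigest()[:32]

# The network only ever handles (name, protected_object); only the designated
# receiver, holding the key, can read and attribute the content.
recovered = Fernet(receiver_key).decrypt(protected_object)
print(name, recovered == reading)
```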
Project (from 2021)
Partners: Institut National des Sciences Appliquées de Lyon - Laboratoire d'Ingénierie des Matériaux Polymères, CENTRE D'INNOVATION EN TELECOMMUNICATIONS ET INTEGRATION DE SERVICES
Funder: French National Research Agency (ANR)
Project Code: ANR-21-CE25-0003
Funder Contribution: 291,370 EUR

The DRON-MAP project focuses on the use of cooperative UAV networks for pollution plume monitoring in emergency situations (industrial accidents, natural disasters, deliberate terrorist releases, etc.). The deployment of a UAV network in these situations faces several scientific and technical challenges, such as accounting for the strong plume dynamics, analysing data in a timely manner, ensuring reliable communication and coordination between UAVs, and planning optimal trajectories. The objective of the DRON-MAP project is to address these challenges by proposing a new, global and systemic approach. Based on reliable communication and coordination between drones, our approach will combine instantaneous estimation and prediction of the plume evolution with efficient anticipatory algorithms for optimal path planning. A network testbed of a few communicating UAVs will be set up in order to assess real-world feasibility and performance at a small scale.
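A minimal sketch of the anticipatory path-planning idea, under strong assumptions: a toy Gaussian plume is advected by a constant wind, and at each step every UAV greedily moves to the neighbouring grid cell with the highest predicted concentration, skipping cells already claimed by a teammate so the fleet spreads out. The plume model, wind and greedy rule are all illustrative; the project's estimators and planners are not specified here.

```python
import math

# Toy anticipatory, coordinated waypoint choice on a grid. Each UAV moves toward
# the neighbouring cell with the highest *predicted* concentration (plume
# advected by wind); cells claimed by a teammate are skipped to spread the fleet.

WIND = (1.0, 0.3)  # plume drift per time step (dx, dy), illustrative

def predicted_concentration(x, y, t, source=(0.0, 0.0), strength=100.0):
    """Toy Gaussian plume whose centre is advected downwind over time."""
    cx, cy = source[0] + WIND[0] * t, source[1] + WIND[1] * t
    return strength * math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 10.0)

def next_waypoints(uav_positions, t):
    """Greedy coordinated step: best neighbouring cell not taken by a teammate."""
    claimed, plan = set(), []
    for (x, y) in uav_positions:
        candidates = [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        candidates = [c for c in candidates if c not in claimed]
        best = max(candidates, key=lambda c: predicted_concentration(*c, t + 1))
        claimed.add(best)
        plan.append(best)
    return plan

fleet = [(0, 0), (2, 1), (-1, 2)]
for t in range(3):
    fleet = next_waypoints(fleet, t)
    print(f"t={t + 1}: {fleet}")
```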
Project (from 2018)
Partners: Grenoble INP - UGA, Institut National des Sciences Appliquées de Lyon - Laboratoire d'Ingénierie des Matériaux Polymères, CEA Laboratoire d'Électronique et de Technologie de l'Information, Département d'Informatique de l'École Normale Supérieure, Techniques de l'Informatique et de la Microélectronique pour l'Architecture des systèmes intégrés (TIMA), CENTRE D'INNOVATION EN TELECOMMUNICATIONS ET INTEGRATION DE SERVICES, UGA, UJF, CNRS, INS2I
Funder: French National Research Agency (ANR)
Project Code: ANR-18-CE46-0011
Funder Contribution: 594,704 EUR

Most computations on real numbers manipulate them as floating-point numbers. State-of-the-art processor architectures offer functional units supporting the half, single or double precision formats of the IEEE-754 standard [30]. These formats, of respectively 16, 32 or 64 bits, offer the equivalent of 3, 7 and 15 decimal digits. The reason for the two larger formats is not that programmers need that many digits on the output. Rather, they are useful to protect them from the accumulation and amplification of rounding errors in the intermediate computations. However, the programmer has to make a drastic choice between these precisions, and the chosen precision is unlikely to exactly match the needs of the application. At best, it will be overkill, meaning wasted time, memory and power in computing useless bits. At worst, it will be insufficient, meaning numerically wrong results, with possibly catastrophic consequences in a world where embedded computing systems are increasingly involved in our lives. Considering this, the main claim of this project is the following: accuracy should become a first-class concern in computing ecosystems that are currently focused mainly on the cost-performance trade-off. This will lead to better-quality numerical software and greater trust in its results, but also to better performance and power consumption when the accuracy needs are limited. The objective of this project is therefore to add accuracy considerations to cost/performance trade-offs at all levels of a computing system:
1. at the hardware level, with better support for lower-than-standard and higher-than-standard precisions, and with hardware support for adaptive precision;
2. at the level of run-time support software, in particular answering the memory management challenges entailed by adaptive precision;
3. at the lower level of mathematical libraries (for instance BLAS for linear algebra), enhancing well-established libraries with precision and accuracy control;
4. at the higher level of mathematical libraries (which includes linear solvers such as LAPACK, ad hoc steppers for ordinary differential equations, triangularization problems in computational geometry, etc.); this level is characterized by iterative methods where the accuracy and precision control of the lower levels will enable higher-level properties such as convergence and stability;
5. at the compiler level, enhancing optimising compilers with novel optimisations related to precision and accuracy;
6. at the language level, embedding accuracy specification and control in existing languages, and possibly defining domain-specific languages with accuracy-aware semantics for some classes of applications.
To achieve this goal, the project will focus on specific, useful use cases in the domains of linear algebra, computational geometry, and machine learning. The main challenge to address at the lower levels is to offer precision control at an acceptable overhead. For this, the project can build on the expertise of the project coordinator in hardware and software "computing just right", on the processor integration expertise at LETI, and on the compilation expertise at ENS. At the higher levels, the main challenge is to understand and formalize the accuracy requirements of a computation at each level. There is also the pervasive challenge of designing, at each level, the relevant interfaces for accuracy and precision control. Defining where the precision can be decided at compile time, and where it has to be decided at run time, is also difficult. We claim that we can address this very difficult challenge for the considered use cases, thanks to the complementary application-domain experience of the project members. The project will develop a demonstrator based on a RISC-V system enhanced with variable-precision hardware, and an accuracy-aware software stack that covers all the levels above.
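As a small, generic illustration of why the choice of format matters (not the project's demonstrator), the sketch below sums 100,000 copies of 0.01 in each of the three IEEE-754 binary formats and reports the accumulated rounding error: in half precision the running sum quickly becomes too coarse to absorb further increments, while double precision is effectively exact here.

```python
import numpy as np

# Summing 100,000 copies of 0.01 in half, single and double precision.
# Half precision drifts badly, single precision a little, double is
# effectively exact here. Generic demonstration, not project code.

values = np.full(100_000, 0.01)
exact = 1000.0

for dtype in (np.float16, np.float32, np.float64):
    acc = dtype(0.0)
    for v in values.astype(dtype):
        acc = dtype(acc + v)   # every intermediate result is rounded to the chosen format
    err = abs(float(acc) - exact) / exact
    print(f"{np.dtype(dtype).name:8s} sum = {float(acc):10.3f}   relative error = {err:.2e}")
```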