
ESIGELEC
5 Projects
Project (from 2022)
Partners: Laboratoire d'Innovation Numérique pour les Entreprises et les Apprentissages au service de la Compétitivité des Territoires; Institut National des Sciences Appliquées de Lyon - Laboratoire d'Ingénierie des Matériaux Polymères; Laboratoire d'Informatique, de Traitement de l'Information et des Systèmes - EA 4108; ESIGELEC
Funder: French National Research Agency (ANR)
Project Code: ANR-21-ASRO-0005
Funder Contribution: 299,055 EUR

The fields of Industry 5.0 and defense increasingly rely on systems of systems in which robotic agents must adapt to the humans they interact with. Heterogeneous fleets of agents equipped with perception devices are a valuable asset: by merging individual information, they make it possible to optimize fleet operations, secure the convoy, improve safety and security for human operators, and increase flexibility when situations or the environment change. Pooling information produces a global view of the situation from the individual perceptions of each robotic or non-robotic agent. Each individual perception module produces an interpretation of the scene that is inherently subject to uncertainty. The consequences of deploying a fleet in complex or hostile environments must also be considered: the communication link required for information exchange may have very limited bandwidth, or none at all when the link is broken, even temporarily.
The viewpoint positions required to build the situation view also depend on the quality of the localization information, when it is available. The SCOPES project proposes to develop a solution for producing a situation view augmented with uncertainty, as a source of decision information. The contributions of the project will be:
- a representation formalism for the situation view that integrates the different sources of uncertainty and allows interpretation by humans;
- a robust localization method based on the graph paradigm and the semantic information provided by each agent;
- a functional specification and associated datasets for the objective, quantitative evaluation of collaborative perception situations, exploiting the outstanding technological platforms of the project partners.
The SCOPES project will deliver results at TRL 4. The project's interest for economic actors was recognized by its labeling by NAE.
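The pooling of uncertain per-agent perceptions described above can be illustrated with a minimal sketch (not the SCOPES formalism itself): several agents report a class belief for the same object, and the beliefs are fused as independent likelihoods. The observation format and the numbers are hypothetical.

```python
from collections import defaultdict
import math

def fuse_labels(observations):
    """Fuse per-agent class beliefs (hypothetical format: {class: probability})
    into a single distribution by multiplying the independent likelihoods
    in log space and renormalizing."""
    log_score = defaultdict(float)
    classes = set().union(*(obs.keys() for obs in observations))
    for obs in observations:
        for c in classes:
            # Missing classes get a small floor probability
            log_score[c] += math.log(obs.get(c, 1e-6))
    z = sum(math.exp(s) for s in log_score.values())
    return {c: math.exp(log_score[c]) / z for c in log_score}

# Two robots and a fixed camera observe the same object (illustrative values).
obs = [
    {"pedestrian": 0.7, "cyclist": 0.3},
    {"pedestrian": 0.6, "cyclist": 0.4},
    {"pedestrian": 0.5, "cyclist": 0.5},  # an uninformative agent changes nothing
]
belief = fuse_labels(obs)
```

An uninformative agent (uniform belief) leaves the fused distribution unchanged, while agreeing agents sharpen it; a real system would also weight agents by their reported reliability.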
Project (from 2018)
Partners: CHBRE REGIONALE AGRICULTURE BRETAGN; ESIGELEC; UMR AGROECOLOGIE; STE INNOVATION TECHNO INDUS AVANCEE; CHBRE REG AGRICULT DES PAYS DE LA L
Funder: French National Research Agency (ANR)
Project Code: ANR-17-ROSE-0002
Funder Contribution: 499,877 EUR

The evolution of agriculture towards sustainability requires new practices that limit herbicide use and promote the emergence of systems adapted to social, economic and environmental constraints. Eliminating herbicides requires systemic weed-management strategies combining operations at various levels. While inter-row weeding already has alternative solutions, intra-row weed management requires better local observation, decision and intervention in the field. These three components (perception/decision/action) are key concepts of the sensorimotor loops widely used in the automation domain. Indeed, automatic control offers an inspiring perspective on weed management: it provides a framework and a set of extensively proven techniques. By designing and applying sensorimotor loops to weed management, ROSEAU provides a systemic approach involving multi-scale actions, from autonomous localized intra-row weeding to the design of global robotized interventions. The main contributions of ROSEAU concern:
- multi-scale weed detection and identification;
- weed proliferation simulation and management;
- robotic weeding intervention design and optimization;
- robust robotic autonomy oriented towards agricultural contexts;
- tool-based autonomous navigation;
- adaptation, control and exploitation of intra-row mechanical and chemical weeding tools.
These contributions are articulated in a coherent framework that reinforces the impact of each of them.
With UMR AgroEcologie, IRSEEM, SITIA and the Chambres Régionales d'Agriculture of Pays de la Loire and Bretagne, the ROSEAU consortium gathers world-class references in both the agriculture and ICT domains, with previous challenge victories (IRSEEM at the ANR Challenge Argos, 2015 and 2016), award-winning work (UMR Agroécologie, silver medal at SIMA 2011) and the continuation of previous successful collaborations between members. ROSEAU is reinforced by a Scientific and Technical Committee of experts in agricultural practices, usages and humanities.
Project (from 2021)
Partners: Atmel (France); ESIGELEC; University of Le Havre; Atmo Normandie; SQUADRONE SYSTEM; Institut National de l'Environnement Industriel et des Risques (INERIS); GROUPE DE RECHERCHE EN ELECTROTECHNIQUE ET AUTOMATIQUE DU HAVRE
Funder: French National Research Agency (ANR)
Project Code: ANR-21-SIOM-0008
Funder Contribution: 102,396 EUR

The recent assessment carried out by BARPI shows that accidents at classified installations in France are increasing, with a predominance of fires. To characterize their possible impact on the environment and the population, it is essential to collect data on the event as quickly and reliably as possible. Measurement and sampling campaigns are then necessary to identify the substances emitted and the fallout areas, and to build an adequate sampling plan. The DESIHR project (Swarm Drones for the air Monitoring of High Risk Industrial Sites) aims to study the contribution of new technologies to characterizing, in real situations and more quickly, the substances present in a fire plume and their emission conditions, in order to produce predictive maps of their propagation. It is based on a fleet of autonomous UAVs capable of adapting its flight plan according to the information acquired by each UAV, in order to fulfill two missions:
- positioning itself in the plume dispersion axis at increasing distances from the source in order to take samples (coupling micro-sensors with automated-opening canisters) that can be analyzed rapidly on the ground with a portable GC/MS for gases, and in the laboratory for soot (granulometry, chemistry, electron microscopy). The chemical characterizations will make it possible to evaluate the dilution rate and the flows through vertical sections of the plume;
- acquiring video images of the plume from outside it, simultaneously from different viewing angles. This information will be transmitted live to the crisis cell so that image processing can determine parameters useful for plume modeling (plume height, volume, section and shape, etc.).
The project is broken down into five tasks:
- select the most relevant UAV payloads for the missions and define the flight strategy for a swarm of UAVs with regard to various constraints (regulatory, aggressive environments, etc.);
- implement the algorithms defining the collective actions of the UAVs, the flight controls and the flight plans, yielding a fleet of autonomous UAVs whose behavior is driven by the data acquired by their integrated payloads;
- mechanically integrate the capture systems on the UAVs and perform their electronic interfacing;
- set up experiments in which the generation of artificial model plumes is controlled, providing a simple, controlled framework for evaluating the performance of the payloads, the performance of the collective actions undertaken by the UAVs and, finally, the relevance of the approach to the final challenges of chemical analysis and modeling;
- carry out a demonstration in a realistic situation on the SDIS76 exercise platform in Le Havre, the results of which will make it possible to study the industrial valorization of the technological solutions proposed by the project.
This project is multidisciplinary, bringing together skills in crisis management, robotics, automation, servo-control, mechanics, atmospheric pollutant metrology, image analysis, atmospheric modeling and air quality. To cover these fields of expertise, six partners are directly involved in the project, with a balance between research-and-development approaches and the representation of end users and distributors.
The project will also benefit from the support of the members of the Normandy UAV Innovation Center (CIDN, whose seven founding players include ULHN, NAE and Le Havre Seine Développement) through access to certain shared equipment (UAVs, simulators) and CIDN resources (indoor/outdoor flight zones).
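The first mission above, positioning UAVs along the plume dispersion axis at increasing distances from the source, can be sketched as simple waypoint generation. The coordinate convention, altitude and distances are illustrative assumptions, not project specifications.

```python
import math

def plume_waypoints(source, wind_bearing_deg, distances, altitude=50.0):
    """Place sampling waypoints downwind of a source along the plume
    dispersion axis. `wind_bearing_deg` is the direction the wind blows
    toward, in degrees clockwise from north; all values are illustrative."""
    rad = math.radians(wind_bearing_deg)
    ux, uy = math.sin(rad), math.cos(rad)  # unit vector of the plume axis (east, north)
    sx, sy = source
    return [(sx + d * ux, sy + d * uy, altitude) for d in distances]

# Source at the origin, wind blowing toward bearing 90° (due east);
# three UAVs sample at 100 m, 300 m and 600 m downwind.
wps = plume_waypoints((0.0, 0.0), 90.0, [100.0, 300.0, 600.0])
```

A real mission planner would additionally account for regulatory no-fly zones, plume rise and inter-UAV separation, which this sketch ignores.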
Project (from 2021)
Partners: Laboratory for Innovations in Sensing, Estimation and Control (LISEC); CNRS; ICL; IBISC; ESIGELEC; CRAN; UL; UEVE; University of Paris-Saclay; FAAR SAS; INS2I; CHU
Funder: French National Research Agency (ANR)
Project Code: ANR-20-CE48-0015
Funder Contribution: 514,285 EUR

The development of high-performance, reliable controllers for connected and autonomous vehicles (CAVs) will require real-time measurements or estimates of many variables on each vehicle. Examples of variables needed for feedback include: longitudinal distances, velocities and accelerations of nearby vehicles; lateral position of the vehicle in its own lane; vehicle yaw angle; slip angle; yaw rate; steering angle; lateral acceleration; and roll angle. There are also environmental variables that need to be measured, such as the tire-road friction coefficient, snow cover on the road, and the presence of unexpected obstacles. Measuring all of these variables is expensive. Indeed, some of them, such as slip angle and roll angle, require sensors that cost thousands of dollars; for example, the Datron optical sensor for slip-angle measurement costs over €10,000. In addition, several variables cannot be measured at all, because no sensor is available at any cost. Furthermore, a CAV requires highly reliable sensors and actuators: the failure of any one sensor or actuator, whether due to faults, cyber-attacks or denial of service, can cause a disastrous accident. Hence reliable fault-diagnostic and fault-handling systems are also needed. Such systems cannot be based on hardware redundancy, which would require many extra copies of the same sensors; instead, they must rely on estimation algorithms and analytical redundancy.
For all these reasons, the development of intelligent estimation algorithms is highly important for autonomous vehicles. Throughout this project we propose original ideas on estimation, a necessary and crucial step for the reliability, resilience and safety of CAVs. The overall objective of the proposal is to develop efficient estimation algorithms that reconstruct the unmeasurable state variables required to design controllers and fault-diagnostic schemes for CAVs. More specifically, the issues considered are safe and stable trajectories, the estimation of faults in sensors and actuators, and the detection of cyber-attacks. We aim to propose a novel approach to tracking vehicles in platoons and on urban roads. The idea we will explore in this project is the development and use of learning-based nonlinear observers. Several components of a vehicle (e.g. tires) have highly complex models whose parameters are difficult to obtain and also vary significantly with time. This proposal will therefore use a modeling approach that combines physically meaningful differential equations with adaptive, online-learning-based neural networks to represent the vehicle dynamics. In particular, well-understood phenomena such as force balances, mechanical motion per Newton's laws, aerodynamic drag, rolling resistance, road grade, combined lateral and roll acceleration terms and the influence of road bank angle will be modeled using analytical differential equations. Tire models for both lateral and longitudinal forces, the friction circle, engine maps, and suspension stiffness and damping characteristics will be modeled using neural networks whose weights are initially obtained by training via back-propagation. In addition to this initial training, the model parameters of the neural networks and a subset of the parameters of the physically meaningful differential equations will also be updated automatically online during regular vehicle use.
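The hybrid modeling approach described above — analytical equations for well-understood forces plus an online-trained neural network for hard-to-model components — can be sketched for longitudinal dynamics as follows. All coefficients, the network size and the learning rate are illustrative assumptions; the project's actual models are far richer.

```python
import numpy as np

class HybridLongitudinalModel:
    """Sketch of a hybrid vehicle model: analytical physics for drag,
    rolling resistance and road grade, plus a tiny neural network that
    learns the residual acceleration (standing in for tire/engine effects)
    online by gradient descent. All constants are illustrative."""

    def __init__(self, mass=1500.0, hidden=8, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.m = mass
        self.W1 = rng.normal(0.0, 0.3, (hidden, 2))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.3, hidden)
        self.lr = lr

    def _features(self, v, throttle):
        return np.array([v / 30.0, throttle])  # crude input scaling

    def _residual(self, x):
        h = np.tanh(self.W1 @ x + self.b1)
        return float(self.W2 @ h), h

    def accel(self, v, throttle, grade=0.0):
        # Physically meaningful terms (assumed coefficients)
        f_drag = 0.4 * v * v                      # aerodynamic drag
        f_roll = 0.015 * self.m * 9.81            # rolling resistance
        f_grade = self.m * 9.81 * np.sin(grade)   # road grade
        a_nn, _ = self._residual(self._features(v, throttle))
        return a_nn - (f_drag + f_roll + f_grade) / self.m

    def update(self, v, throttle, grade, a_measured):
        """One online gradient step on the squared acceleration error."""
        x = self._features(v, throttle)
        _, h = self._residual(x)
        err = self.accel(v, throttle, grade) - a_measured
        dh = err * self.W2 * (1.0 - h * h)        # backprop through tanh
        self.W2 -= self.lr * err * h
        self.W1 -= self.lr * np.outer(dh, x)
        self.b1 -= self.lr * dh
        return err
```

Repeatedly calling `update` with measured accelerations shrinks the prediction error, mimicking the online adaptation of the learned components during regular vehicle use.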
More sophisticated and intelligent algorithms will be developed to cope with sensor faults and disturbances, cyber-attacks, and data loss. All possible complex architectures of cyber-attacks and data loss will be investigated. Although this project belongs to fundamental research, experimental developments will be carried out with the industrial partner: FAAR will make an innovation platform available to the project for validating its developments.
Project (from 2023)
Partners: ESIGELEC; SITIA
Funder: French National Research Agency (ANR)
Project Code: ANR-23-MOXE-0004
Funder Contribution: 422,918 EUR

The navigation of an autonomous system (robot or vehicle) in an unstructured environment remains an open problem despite the significant progress made in recent years in the field of autonomous vehicles. Recent innovations in perception using deep-learning methods suggest solutions that have yet to be put into practice in real applications in complex environments. In the ASTRA (Autonomous System for Terrain Recognition & Adaptation) project, ESIGELEC and SITIA propose to develop a system for environmental perception, semantic mapping and navigation adapted to the constraints of difficult real environments: natural, forest or agricultural terrain, and unstructured and/or disturbed environments. The system is based on a complete set of sensors (color and NIR cameras, neuromorphic cameras, LIDAR, RADAR, GNSS-RTK and an inertial unit) that acquire reliable, rich measurements of the environment. The perceptions from these sensors are then merged by dedicated algorithms to improve the reliability of the collected data and to compensate for sensor failures or transient defects (occlusions, dust, rain). Finally, a set of higher-level algorithms builds on these merged data to achieve an understanding of the environment (detection of visible or hidden obstacles, qualification of the quality and geometry of the ground), precise localization that is robust to GNSS failures and, finally, the fully or semi-autonomous navigation of a mobile robot. Human-machine collaboration is not forgotten: several alternative piloting modes are developed, each with a different level of decision-making by the machine, with the objective of a simple and intuitive handover between the human operator and the onboard intelligence.
During all phases of the project, the methods and algorithms are developed in a generic, reusable way so that they can easily be transposed to other types of mobile systems (vehicles, and robots of different sizes and terrain-crossing capabilities). This is guaranteed by the use of three robots of different sizes, on which the algorithms are tested and validated at each stage of the challenge. The new methods developed during the project will be exploited through technology transfer to autonomous agricultural robotics, a field in which SITIA is one of the leaders.
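The fusion of redundant sensors with tolerance to transient defects, as described for ASTRA, can be illustrated by a minimal sketch that masks invalid or stale readings before inverse-variance fusion. The reading format, thresholds and numbers are hypothetical.

```python
import math

def fuse_ranges(readings, max_age=0.2, now=0.0):
    """Fuse redundant range readings by inverse-variance weighting,
    skipping sensors that are flagged invalid, stale or non-finite —
    a crude stand-in for tolerating occlusions, dust or dropouts.
    Each reading is (value_m, variance, timestamp_s, valid_flag);
    the format and thresholds are illustrative, not from the project."""
    usable = [(v, var) for v, var, t, ok in readings
              if ok and (now - t) <= max_age and math.isfinite(v)]
    if not usable:
        return None  # no trustworthy sensor: the caller must degrade gracefully
    weights = [1.0 / var for _, var in usable]
    return sum(w * v for w, (v, _) in zip(weights, usable)) / sum(weights)

# Hypothetical snapshot: LIDAR and RADAR agree, the camera depth is
# blinded (NaN) and one estimate is a second old.
readings = [
    (4.9, 0.04, 0.0, True),           # LIDAR: precise
    (5.2, 0.25, 0.0, True),           # RADAR: noisier
    (float("nan"), 0.02, 0.0, True),  # camera depth lost to dust
    (4.0, 0.10, -1.0, True),          # stale estimate, discarded
]
fused = fuse_ranges(readings)
```

Returning `None` rather than a guess when every channel fails mirrors the handover logic the project targets: the onboard intelligence must hand control back (or stop) instead of acting on untrusted data.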