
Laboratoire informatique, signaux systèmes de Sophia Antipolis
24 Projects, page 1 of 5
Project (from 2023)
Partners: Laboratoire informatique, signaux systèmes de Sophia Antipolis
Funder: French National Research Agency (ANR)
Project Code: ANR-22-CE23-0015
Funder Contribution: 280,998 EUR
CEDRO falls into the broad theme of performing decentralized inference (stochastic optimization, estimation, and learning) over graphs. It notably recognizes the increasing ability of many emerging technologies to collect data in a decentralized and streamed manner. The focus is therefore on designing decentralized approaches in which devices collect data continuously. The project also recognizes that modern machine learning applications, where tremendous volumes of training data are generated continuously by a massive number of heterogeneous devices, have several key properties that differentiate them from standard distributed inference applications. Particular focus will be given to developing and studying approaches for decentralized learning in statistically heterogeneous (multitask) settings, in the presence of limited communication resources and heterogeneous system devices. The project will specifically emphasize demonstrating the value of the proposed approaches in machine learning frameworks using publicly available datasets.
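To make the decentralized-inference setting concrete, here is a minimal sketch of one classical algorithm family in this area, adapt-then-combine diffusion LMS, in which each node adapts to its own streaming data and then combines estimates with its neighbours. The ring topology, step size, and linear data model below are illustrative assumptions, not the project's actual methods.

import numpy as np

# Minimal adapt-then-combine (ATC) diffusion LMS sketch for decentralized
# estimation over a graph with streaming data. The ring topology, step size
# and linear data model are illustrative placeholders.
rng = np.random.default_rng(0)
n_nodes, dim, mu = 5, 3, 0.02
w_true = rng.standard_normal(dim)              # common parameter to estimate

# Row-stochastic combination matrix for a ring graph: each node averages
# itself and its two neighbours.
A = np.zeros((n_nodes, n_nodes))
for k in range(n_nodes):
    for j in (k - 1, k, (k + 1) % n_nodes):
        A[k, j] = 1.0
A /= A.sum(axis=1, keepdims=True)

w = np.zeros((n_nodes, dim))                   # local estimates
for t in range(2000):
    psi = np.empty_like(w)
    for k in range(n_nodes):                   # adapt: one streaming sample per node
        x = rng.standard_normal(dim)
        d = x @ w_true + 0.1 * rng.standard_normal()
        psi[k] = w[k] + mu * (d - x @ w[k]) * x
    w = A @ psi                                # combine with neighbours

print("mean estimation error:", np.linalg.norm(w - w_true, axis=1).mean())

In a multitask (statistically heterogeneous) variant, each node would track its own parameter vector and the combination step would be restricted or reweighted accordingly.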
Project (from 2019)
Partners: JAMSTEC, OSU, Universidade de Sao Paolo, Sorbonne University, Laboratoire informatique, signaux systèmes de Sophia Antipolis
Funder: French National Research Agency (ANR)
Project Code: ANR-18-BELM-0003
Funder Contribution: 358,236 EUR

Project (from 2024)
Partners: SCALAB, CNRS, INSHS, Laboratoire informatique, signaux systèmes de Sophia Antipolis, ETHZ
Funder: French National Research Agency (ANR)
Project Code: ANR-23-CE45-0025
Funder Contribution: 389,031 EUR
Human perception of a complex visual scene requires a cognitive process of visual attention that sequentially directs the gaze towards a region of interest, so as to acquire relevant information selectively through foveal vision, which offers maximal acuity and contrast sensitivity in a small region around the gaze position, whereas peripheral vision covers a large field of view with lower resolution and contrast sensitivity. This cognitive process mixes bottom-up attention, driven by saliency, and top-down attention, driven by the demands of the task (recognition, counting, tracking, etc.). While numerous works have investigated visual attention in standard RGB images, it has barely been exploited for the recently developed event sensors (DVS). Inspired by human perception, the interdisciplinary NAMED project aims to design neuromorphic event-based vision systems for embedded platforms such as autonomous vehicles and robots. A first stage will investigate and develop new bottom-up and top-down visual attention models for event sensors, in order to focus processing on relevant parts of the scene.
This stage will require understanding what drives attention in event data. A second stage will design and implement a hybrid digital-neuromorphic attentive system for ultra-fast, low-latency, and energy-efficient embedded vision. This stage will require setting up a dual vision system (a foveal RGB sensor and a parafoveal DVS), designing Spiking and Deep Neural Networks, and exploiting a novel system-on-chip developed at ETH Zürich. A final stage will validate and demonstrate the results by applying the robotic operational platform to real-life dynamic scenarios such as autonomous vehicle navigation, ultra-fast object avoidance, and target tracking.
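As a toy illustration of bottom-up attention on event data (not the NAMED models themselves), the sketch below bins DVS events into an exponentially decaying activity map and selects the densest patch as the current attention focus. The sensor resolution, decay constant, patch size, and (t, x, y, polarity) event format are assumptions made for the example.

import numpy as np

# Toy bottom-up attention on event data: accumulate events into a decaying
# activity map and return the most active patch as the attention focus.
# Resolution, decay constant and event format are illustrative assumptions.
H, W, PATCH = 128, 128, 16          # assumed DVS resolution and patch size
TAU = 0.05                          # decay time constant in seconds

def update_attention(activity, events, dt):
    """Decay the map by dt seconds, add a batch of (t, x, y, polarity) events,
    and return the top-left corner of the most active PATCH x PATCH window."""
    activity *= np.exp(-dt / TAU)
    for _t, x, y, _polarity in events:
        activity[y, x] += 1.0
    sums = activity.reshape(H // PATCH, PATCH, W // PATCH, PATCH).sum(axis=(1, 3))
    r, c = np.unravel_index(np.argmax(sums), sums.shape)
    return r * PATCH, c * PATCH

# Example: a synthetic burst of events around pixel (x=90, y=40) attracts the focus.
rng = np.random.default_rng(1)
activity = np.zeros((H, W))
burst = [(0.0, 90 + rng.integers(-3, 4), 40 + rng.integers(-3, 4), 1) for _ in range(200)]
print(update_attention(activity, burst, dt=0.01))   # -> (32, 80)

A top-down component could simply reweight this map with task-dependent priors before taking the argmax.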
Project (from 2021)
Partners: RENAULT SAS - GUYANCOURT, Laboratoire d'électronique antennes et télécommunications, Nice Sophia Antipolis University, LEAT, CENTRE DE RECHERCHE CERVEAU ET COGNITION, Laboratoire informatique, signaux systèmes de Sophia Antipolis
Funder: French National Research Agency (ANR)
Project Code: ANR-20-CE23-0004
Funder Contribution: 711,344 EUR
Autonomous and intelligent embedded solutions are mainly designed as cognitive systems composed of a three-step process: perception, decision, and action, invoked periodically in a closed loop in order to detect changes in the environment and appropriately choose the actions to be performed according to the mission to be achieved. In an autonomous agent such as a robot, a drone, or a vehicle, these three stages are quite naturally instantiated as i) the fusion of information from different sensors, ii) scene analysis, typically performed by artificial neural networks, and iii) the selection of an action to be applied to actuators such as engines, mechanical arms, or any other means of interacting with the environment. In this context, the growing maturity of the complementary technologies of Event-Based Sensors (EBS) and Spiking Neural Networks (SNN) is demonstrated by recent results. The nature of these sensors questions the very way in which autonomous systems interact with their environment. Indeed, an Event-Based Sensor reverses the perception paradigm currently adopted by Frame-Based Sensors (FBS), from systematic and periodic sampling (whether an event has happened or not) to an approach reflecting the true causal relationship, where the event triggers the sampling of the information. We propose to study this disruptive change of the perception stage and how event-based processing can cooperate with the current frame-based approach to make the system more reactive and robust. Meanwhile, SNN models have been studied for several years as an interesting alternative to Formal Neural Networks (FNN), both for their reduced computational complexity in deep network topologies and for their natural ability to support unsupervised and bio-inspired learning rules. The most recent results show that these methods are becoming more and more mature and are almost catching up with the performance of formal networks, even though most of the learning is done without data labels. But should we compare the two approaches when the very nature of their input data is different?
In the image-processing context of interest here, one (the FNN) deals with whole frames and categorizes objects, while the other (the SNN) is particularly well suited to event-based sensors and therefore to capturing spatio-temporal regularities in a continuous flow of events. The approach we propose to follow in the DeepSee project is to combine spiking networks with formal networks rather than putting them in competition.
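To illustrate the event-driven processing paradigm contrasted above with frame-based sampling (a generic textbook example, not DeepSee's architecture), the sketch below updates a leaky integrate-and-fire neuron only when input events arrive; the time constant, threshold, and synaptic weight are placeholder values.

import math

# Toy leaky integrate-and-fire (LIF) neuron updated only when events arrive,
# in contrast to frame-based pipelines that compute at a fixed rate.
# Time constant, threshold and weight are placeholder values.
TAU = 0.02        # membrane time constant (s)
V_THRESH = 1.0    # firing threshold
V_RESET = 0.0     # reset potential

def run_lif(event_times, weight=0.3):
    """event_times: sorted input spike timestamps (s); returns output spike times."""
    v, t_last, out_spikes = V_RESET, 0.0, []
    for t in event_times:
        v *= math.exp(-(t - t_last) / TAU)   # leak only over the elapsed interval
        v += weight                          # integrate the incoming event
        if v >= V_THRESH:
            out_spikes.append(t)             # emit a spike and reset
            v = V_RESET
        t_last = t
    return out_spikes

print(run_lif([0.001 * k for k in range(1, 20)]))   # dense burst -> output spikes
print(run_lif([0.1, 0.3, 0.6]))                     # sparse input -> no spikes

No computation happens between events, which is what makes such pipelines attractive for low-latency, low-power embedded vision.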
Project (from 2020)
Partners: LEAT, Laboratoire d'électronique antennes et télécommunications, Nice Sophia Antipolis University, RENAULT SW LABS SAS, SYMAG, Laboratoire informatique, signaux systèmes de Sophia Antipolis, Université Nice Sophia Antipolis - Groupe de Recherche en Droit, Economie et Gestion
Funder: French National Research Agency (ANR)
Project Code: ANR-19-CE25-0008
Funder Contribution: 689,417 EUR
The objective of the "Smart IoT for Mobility" (SIM) project is to work on IoT-type architectures (low computing capacity, very low power, small footprint) that can access blockchains and smart contracts, while remaining easily understandable to users, accepted by those same users, legally sound, and therefore usable by everyone. The use case addressed by the project will be Renault's "Smart Services Book", or "advanced maintenance book", with vehicle accidents as the main scenario. Such use cases are studied intensively by insurers seeking to make accident reports tamper-proof. The SIM project will bring together researchers in law, economics, experimental economics, computer science, and electronics, and will ensure the soundness of its experiments through its partnership with Renault Software Labs and Symag, a subsidiary of BNP Paribas.
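As a minimal illustration of why a blockchain-style structure makes such records tamper-evident (a generic hash-chain sketch, not SIM's architecture or Renault's data format), each entry below stores the hash of its predecessor, so altering any past record breaks verification; the field names and the use of SHA-256 over JSON are assumptions.

import hashlib
import json

# Generic hash-chain sketch: each entry commits to the previous one, so any
# modification of a past record invalidates the chain. Field names and the
# SHA-256-over-JSON encoding are illustrative assumptions.
def record_hash(record, prev_hash):
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain, record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev_hash,
                  "hash": record_hash(record, prev_hash)})

def verify(chain):
    prev_hash = "0" * 64
    for entry in chain:
        if entry["prev"] != prev_hash or entry["hash"] != record_hash(entry["record"], prev_hash):
            return False
        prev_hash = entry["hash"]
    return True

chain = []
append(chain, {"event": "maintenance", "km": 42000})
append(chain, {"event": "accident report", "km": 43500})
print(verify(chain))                      # True
chain[0]["record"]["km"] = 10000          # tamper with a past entry
print(verify(chain))                      # False

A real deployment would anchor such hashes in smart contracts so that no single party, including the record keeper, can rewrite history unnoticed.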