
ICL
44 Projects, page 1 of 9
Project
- Partners: CRAN, UL, INS2I, PREDICT Nancy, ICL, CHU, CNRS
- Funder: French National Research Agency (ANR)
- Project Code: ANR-15-LCV1-0005
- Funder Contribution: 300,000 EUR
The modernization of industrial systems is presented in the French "Factory of the Future (FoF)" strategic plan as a necessary lever to improve competitiveness and maintain industrial employment in France. It raises major innovation challenges in which PHM (Prognostics and Health Management), a scientific discipline initiated by NASA, is a major focus. PHM is structured around key processes such as monitoring/predictive diagnosis, prognostics and decision support. These processes aim to characterize the drifts and degradations of a system and to prevent or anticipate its failures. While standards have been created to formalize these processes, PHM is not yet actually deployed in companies: the available ad hoc software solutions are too limited, proprietary, and not designed on a scientific framework consistent enough to ensure genericity. This scientific challenge is addressed by the research strategy of the CRAN Laboratory (UMR CNRS 7039, University of Lorraine). CRAN is recognized nationally and internationally as a major actor in PHM. It has initiated many foundational works on diagnosis, prognosis and predictive maintenance, as well as on their engineering from a system perspective. These works have been validated at the laboratory level but not sufficiently at the company level, for lack of suitable tools supporting their transfer to an industrial scale. The SME PREDICT is a pioneer in the French market for PHM tools, selling an offer built on the CASIP/KASEM and CASIP Engineering platforms together with the services required to configure them. Nevertheless, these solutions are mainly implemented on complex, one-of-a-kind industrial systems, whereas growth markets are moving towards application areas in which systems are smaller and less complex (e.g. machine tools, tractors, construction equipment).
For the economic sustainability of PREDICT and the growth of its overall leadership in PHM solutions, it must therefore adapt its offer by reducing the total cost by a factor of 10 to 100. This rationalization, based on a modularization of PHM technologies, is the key to opening both the SME/mid-cap (ETI) market and the export market, with a sales volume adapted to these new markets. PREDICT must therefore innovate in three complementary scientific areas: the models and algorithms supporting PHM processes; the engineering needed to deploy these processes on a specific system; and the technologies supporting that deployment. In the short term, these technologies could be cyber-physical technologies built on a software part (PHM algorithms) interacting with a hardware part such as a single-board computer, plug computer or smartphone. These technological and scientific needs in PHM, combined with the complementary skills of CRAN and PREDICT, are the genesis of this LabCom proposal, called PHM-FACTORY (a factory of cyber-physical PHM technologies). It can be seen as an innovation and research platform whose roadmap is based on three phases: consolidation (e.g. short-term transfer), expansion (e.g. new functionalities) and exploration (e.g. prospecting for new services). The roadmap will address issues such as knowledge operationalization, hybrid approaches and self-adaptive algorithms. Across the three axes (models/engineering/technology), the expected innovations will be realized as software components (COTS) and a specialized engineering enabling the development of innovative PHM solutions. The results will also benefit the industrial and academic communities by strengthening the foundations of the PHM discipline and its effectiveness in business and operational contexts.
Finally, this LabCom is a real opportunity to strengthen and sustain the successful collaboration already established between PREDICT and CRAN (several years of joint projects, co-authored publications, etc.) within a stable institutional framework conducive to the emergence of R&D.
Project from 2024
- Partners: CHU, UL, CRAN, INS2I, University of Maryland, Baltimore County, ICL, CNRS
- Funder: French National Research Agency (ANR)
- Project Code: ANR-23-CE94-0001
- Funder Contribution: 328,154 EUR

Nowadays there is an increasing availability of multiple and complementary datasets associated with a given problem, and the main challenge is the extraction of the features that are most useful and relevant for the given task. This is generally achieved by considering source mixing models in which the components (sources) are associated with quantities of interest. Since very little is usually known about the actual interaction among the datasets, it is highly desirable to minimize the underlying assumptions when estimating the sources. This has been the main reason for the growing importance of joint matrix and tensor decompositions, as they not only enable full interaction among the datasets but also yield factor matrices that are directly interpretable. An effective way to capture the inherent relationships among the samples of multiple datasets is to make use of appropriate statistical models. Independent vector analysis (IVA) enables such a powerful formulation with general uniqueness guarantees for matrix decompositions. Another effective approach, primarily based on algebraic arguments, uses tensors to take the multi-way structure of the data into account. Tensors enable unique identifiability by naturally constraining the mixing model. A crucial aspect when dealing with multiple datasets is their sheer number, which can easily reach tens of thousands or more. When the number of datasets grows, an important challenge is how to best summarize the information while making sure that the features relating to individual variability within each dataset are preserved.
Identification of homogeneous subspaces, in which the components within a subspace are highly related (correlated/dependent), is an effective way to summarize the heterogeneity in large datasets. This is the argument behind low-rank models, but with a large number of datasets such subspaces should be defined across subsets rather than across all the datasets in the decomposition. This is an important challenge for tensor methods, which are readily scalable to large datasets. For IVA, on the other hand, where this information is directly captured through a multivariate probability density model, scalability becomes a major concern as the number of datasets increases. Hence, each approach has unique advantages and challenges, and each constitutes a different way to represent and work with multiset data. The methodology developed in this proposal targets multiple large spatio-temporal datasets, i.e., datasets acquired across spatial and temporal dimensions. Spatio-temporal data arises in many domains (neuroscience, environmental science, social media, traffic dynamics, etc.), and the aim is to develop a unified and rigorous framework for extracting homogeneous subgroups and features from such data. In a first stage, we will develop a set of powerful methods for extracting features through identification of homogeneous subgroups in large datasets, with two powerful approaches for spatio-temporal data: (i) a statistically motivated matrix decomposition framework based on IVA, and (ii) coupled tensor decompositions with shared and dataset-specific components. Then, in a second stage, we will establish the connections between these two approaches, both in terms of methods and uniqueness conditions, and develop a methodology for subgroup identification.
Finally, we will apply the developed methodology to fMRI data, and more specifically to the Adolescent Brain Cognitive Development (ABCD) Study, a comprehensive longitudinal dataset from a national and diverse cohort of almost 12,000 children aged 9 to 10 followed throughout adolescence.
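As an illustrative sketch only (not the project's actual methodology), the tensor side of the story can be prototyped with a basic CP (canonical polyadic) decomposition fitted by alternating least squares: each factor matrix is refit by linear least squares while the others are held fixed. All function names here are hypothetical helpers written for this example.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of two factor matrices."""
    return np.einsum('ir,jr->ijr', A, B).reshape(A.shape[0] * B.shape[0], -1)

def cp_als(T, rank, n_iter=500, seed=0):
    """Fit a rank-`rank` CP model T ~ sum_r a_r (outer) b_r (outer) c_r by ALS."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(n_iter):
        # each update is a least-squares solve with the other two factors fixed
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

# demo: fit an exact rank-2 tensor built from known factors
rng = np.random.default_rng(1)
A0 = rng.standard_normal((4, 2))
B0 = rng.standard_normal((5, 2))
C0 = rng.standard_normal((6, 2))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
```

The interpretability mentioned in the abstract shows up here directly: the columns of A, B and C are the per-mode profiles of each component. Coupled decompositions extend this idea by sharing some factors across several such tensors.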
Project
- Partners: INS2I, CHU, CRAN, CNRS, ICL, UL
- Funder: French National Research Agency (ANR)
- Project Code: ANR-12-JS03-0004
- Funder Contribution: 187,049 EUR
For about 30% of patients suffering from focal epilepsy, pharmacological treatments are ineffective and surgery must be considered. The procedure consists in first localizing the epileptogenic zone via a comprehensive examination taking into account neurological examination, fMRI, EEG, SEEG or MEG. The excision is then performed, provided it will not have a major impact on the patient's life. Surgery is effective for only a quarter of drug-resistant patients. It is therefore essential to bring fresh insights into the pathophysiology of seizures in order to open the way to novel localization paradigms that would increase the surgery success rate and potentially lead to novel treatments. In this basic research project, we want to develop and analyze patient-specific dynamical models of the epileptic network that reproduce realistic intracranial EEG activity for patients suffering from the most frequent intractable epilepsies: temporal lobe epilepsies. The model will be used to validate neurophysiological assumptions on the seizure-causing factors and might cast new light on the problem of localizing the (potential) seizure onset zones. The originality of our approach lies in the use of control theory. We want to apply and develop novel methods from emerging control fields such as nonlinear estimation, synchronization of networked systems and hybrid systems, as well as innovative source localization and reconstruction techniques that will attract the attention of the control and signal processing communities, respectively. This interdisciplinary project will be carried out at CRAN (Nancy) and will involve control and signal processing researchers as well as neuroscientists and neurologists.
Project from 2024
- Partners: UCA, UL, LIMOS, Laboratoire Angevin de Recherche en Ingénierie des Systèmes, Laboratoire des Sciences du Numérique de Nantes, CNRS, INS2I, CRAN, CHU, ICL, Sigma Clermont, UCO, ENSMSE
- Funder: French National Research Agency (ANR)
- Project Code: ANR-23-CE10-0005
- Funder Contribution: 517,452 EUR

Industry 4.0 calls for reactive and flexible management of production lines as well as for new decision-support tools based on data collected in real time. However, these tools still need to be implemented and integrated into traditional production management approaches in the broad sense. Our project addresses these questions by improving the articulation of tactical and operational decisions, based on new concepts of predictive and prescriptive maintenance. This will lead to new integrated approaches to planning, scheduling and maintenance that take different levels of uncertainty into account, accompanied by new performance-oriented indicators for the robustness of production plans. In addition, the project will define a set of specifications for the data to be collected and the associated management rules, which will then form the "skeleton" of a digital shadow developed throughout the project and motivated by an industrial application in the medical device sector.
Project from 2015
- Partners: Institut national de recherche en sciences et technologies pour l'environnement et l'agriculture, CHU, Institut de recherche en communications et cybernétique de Nantes, CNRS, ICL, INRIA CENTRE RENNES - BRETAGNE ATLANTIQUE, ONERA (Office National d'Etudes et de Recherches Aérospatiales), INS2I, UL, CRAN
- Funder: French National Research Agency (ANR)
- Project Code: ANR-15-CE23-0021
- Funder Contribution: 490,462 EUR

The past decade has witnessed tremendous interest in the concept of sparse representations in signal and image processing. One of the main reasons for this enthusiasm is the discovery of compressive sensing, a new sampling paradigm defying the theoretical limits established sixty years earlier by Shannon. Compressive sensing led many researchers to focus on inverse problems involving fairly well-conditioned dictionaries, such as those arising from random, independent measurements. Yet in many applications, the dictionaries relating the observations to the sought sparse signal are deterministic and ill-conditioned. In these scenarios, many classical algorithms and theoretical analyses are likely to fail. The BECOSE project aims to extend the scope of sparsity techniques far beyond the academic setting of random and well-conditioned dictionaries.

1. Conception of new algorithms

Inverse problems exploiting the sparse nature of the solution rely on the minimization of the counting function, referred to as the L0-"norm". This problem being intractable in most practical settings, many suboptimal resolutions have been suggested.
The design of algorithms dedicated to ill-conditioned inverse problems will revolve around three lines of thought. First, we will step back from the popular L1 convexification of the sparse representation problem and consider more involved nonconvex formulations; recent works indeed demonstrate their relevance for difficult inverse problems, but designing effective and computationally efficient algorithms remains a challenge for large problems. Second, we will study the benefit of working with continuous dictionaries, in contrast with the classical discrete approach. Third, we will investigate the exploitation of additional sources of structural information (on top of sparsity), such as non-negativity constraints.

2. Theoretical analysis of algorithms

The theoretical analysis aims at characterizing the performance of heuristic sparse algorithms. Traditional worst-case exact recovery guarantees are acknowledged to be rather pessimistic because they may not reflect the average behavior of algorithms. It is noticeable, though, that sharp worst-case exact recovery conditions are not even available for a number of popular L0 algorithms. We will focus on stepwise orthogonal greedy search algorithms, which are very well suited to the ill-conditioned context. We foresee that they will enjoy much weaker recovery guarantees than simpler L0 algorithms. We further propose to elaborate an average-case analysis of greedy algorithms for deterministic dictionaries, which is a major open issue. To do so, several intermediate steps will be carried out, including a guaranteed-failure analysis and the derivation of weakened guarantees of success by taking into account constraints beyond sparsity, such as prior knowledge on the signs, coefficient values and partial support information.

3. From theory to practice

The proposed algorithms will be assessed in the context of tomographic Particle Image Velocimetry (PIV), a rapidly growing imaging technique in fluid mechanics with strong potential impact in several industrial sectors, including the environment, automotive and aeronautical industries. This flow measurement technique aims to determine the 3D displacement of tracer particles that passively follow the flow, based on a limited number of 2D camera images. The resulting inverse problem involves high-dimensional data, as a time sequence of highly resolved 3D volumes must be reconstructed. Presently available methods for 3D reconstruction and flow tracking are still restricted to small volumes, which is the main bottleneck together with accuracy and resolution limits. The sparse approach is the key methodological tool for handling problems of larger dimension. The proposed solutions will be validated using both realistic simulators and real experimental data.
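To make the "stepwise orthogonal greedy search" idea concrete, here is a minimal sketch of Orthogonal Matching Pursuit, the archetypal algorithm of that family: at each step it selects the dictionary atom most correlated with the residual, then refits all selected coefficients by an orthogonal projection. The demo deliberately uses an orthonormal dictionary, for which exact recovery is guaranteed; the ill-conditioned dictionaries targeted by the project are precisely the regime where such guarantees break down.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: a greedy heuristic for
    min ||y - A x||_2  subject to  ||x||_0 <= k."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # greedy selection: atom most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # orthogonal projection: jointly refit all selected coefficients
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# demo on an orthonormal dictionary (recovery is then guaranteed)
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((32, 32)))
x_true = np.zeros(32)
x_true[[4, 11, 27]] = [1.5, -2.0, 0.7]
y = Q @ x_true
x_hat = omp(Q, y, k=3)
```

With an ill-conditioned or coherent dictionary the selection step can pick a wrong atom, and an error made early is never revoked, which is why the project's worst-case and average-case recovery analyses matter.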
1 Organizations, page 1 of 1
- Organization: France