
IBISC
9 Projects, page 1 of 2
Project (from 2021)
Partners: CIC PARIS-EST (Centre d'investigation clinique), INSERM, AP-HP, Vanderbilt University Medical Center / Rodens Lab, UEVE, PRES, IRD, VUMC, University of Paris-Saclay, Research Unit on Cardiovascular, Metabolic and Nutrition Diseases, UMMISCO, IBISC
Funder: French National Research Agency (ANR)
Project Code: ANR-20-CE17-0022
Funder Contribution: 570,520 EUR

Some cardiovascular diseases, such as congenital long QT syndrome (cLQTS), and drug-induced long QT syndrome (diLQTS) can cause a particular form of ventricular arrhythmia called Torsade de Pointes (TdP). While often self-terminating, TdP can degenerate, leading to death. There are three main forms of cLQTS: type 1, caused by mutations in cardiac ion-channel genes leading to IKs current blockade; type 2, IKr blockade; and type 3, INaL activation. On the electrocardiogram (ECG), the QT interval is prolonged in all of these conditions, but the waveforms also carry specific features, including T-wave morphology abnormalities characteristic of each type of cLQTS. Most drugs responsible for diLQTS, and eventually TdP, can be identified by assessing any of these mechanisms on the ECG; diLQTS and cLQTS type 2 therefore share similar ECG footprints. Regulatory agencies require new drugs to undergo thorough QT studies. It is, however, established that limiting the ECG evaluation to QT measurements is poorly predictive of TdP. Predicting TdP risk and characterizing the molecular mechanisms involved are of major interest for patients suspected to carry cLQTS, as they are for other patients receiving drugs that may cause TdP. This is also a major issue for the pharmaceutical industry when developing new drugs. Finally, most physicians prescribing these drugs are unable to correctly quantify QT or evaluate TdP risk, and are not able to immediately consult with expert cardiologists.
Automated, personalized prediction of TdP risk for cLQTS or diLQTS patients, who may not yet be identified as such, can improve the accuracy of physician assessment and lower the risk of adverse events. In this project, we aim to develop such a user-friendly tool using artificial intelligence, which is rapidly entering medical practice. Deep learning (DL) in particular has brought a radical change to pattern recognition and machine learning (ML) itself, improving on most earlier models for learning tasks such as image classification and natural language processing. In cardiology specifically, DL has recently been used for several applications, including the detection of common arrhythmias such as atrial fibrillation, of myocardial infarction, and of cardiac contractile dysfunction. However, the use of DL to predict TdP events in drug-induced and congenital contexts has not yet been explored. In this project, we will use such algorithms to build models that not only increase precision but also provide clinicians with interpretable novel features and representations that could improve patient stratification. We expect that annotating the ECG with the signal features that most influence the prediction will improve the understanding of the molecular mechanisms underlying TdP. Within the consortium, we have already explored the DL hypothesis, and our preliminary results are very encouraging. The objective of our 42-month (3.5-year) project is to advance this research topic, transform it into a translational application in several pilot cardiology departments during a first phase, and design validation clinical trials during a second phase for widespread use in and out of hospital settings. The project involves 5 teams: Teams 1 and 4 are experts in AI and interpretability, and Teams 2, 3 and 5 are world-class experts in cardiology (cLQTS, diLQTS, and TdP); they also bring unique, valuable datasets.
Key results will include (i) an integrated data repository, (ii) DL models that predict TdP, (iii) patient stratification and interpretability, and (iv) a clinical application. The project is fully coherent with the French government's and the ANR's objectives of accelerating AI-based translational applications in medicine and will most likely strengthen France's position in the international arena.
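To give a flavour of how a DL model turns an ECG window into features for risk prediction, here is a minimal, purely illustrative sketch of a single 1D convolutional layer (the building block of the CNNs commonly used on ECGs). The signal values, the kernel, and all names are hypothetical; this is not the project's model, which would be a deep network trained on labelled recordings.

```python
# Illustrative sketch: one 1D conv layer + ReLU + global max pooling,
# mapping a toy "ECG" window to a single scalar feature. Hypothetical values.

def conv1d(signal, kernel):
    """Valid-mode 1D convolution (cross-correlation, as in DL frameworks)."""
    n, k = len(signal), len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(n - k + 1)]

def relu(xs):
    return [max(0.0, x) for x in xs]

def global_max_pool(xs):
    return max(xs)

# A toy "ECG" window and a hand-picked kernel that responds to rising
# slopes (think of an R-wave upstroke); a trained model learns its kernels.
ecg = [0.0, 0.1, 0.9, 1.0, 0.2, 0.0, 0.3, 0.4, 0.1, 0.0]
kernel = [-1.0, 0.0, 1.0]

feature_map = relu(conv1d(ecg, kernel))
score = global_max_pool(feature_map)  # one scalar feature per kernel
```

In a real model many learned kernels run in parallel over much longer windows, and the pooled features feed further layers that output the risk score; the per-sample activations in `feature_map` are also what interpretability methods visualize when annotating the ECG.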
Project (from 2017)
Partners: IBISC, UEVE, INNODURA TB, Institut National de Recherche en Informatique et Automatique, University of Paris-Saclay
Funder: French National Research Agency (ANR)
Project Code: ANR-17-MALN-0004
Funder Contribution: 699,613 EUR

The goal of this challenge is to develop and test accurate localization solutions for emergency responders and security forces. These solutions must work inside buildings and, more generally, in conditions where satellite positioning systems do not perform satisfactorily. The objective is to improve competence in autonomous localization (trajectography) in disturbed environments. Secondary functions such as agent orientation and 3D cartography (3D mapping) of visited areas are also expected. The aim of our LOCA-3D project (Location, Orientation and 3D CArtography) is to overcome this lack of technical and technological knowledge in indoor localization while respecting the challenge constraints. Our solution, based on a combination of inertial and optical sensors, supports the primary localization function as well as the secondary functions (orientation and 3D mapping). The concept relies on an advanced inertial system from which the agent's trajectory is computed. Part of the inertial sensor drift is compensated by the vision system, which simultaneously generates point clouds. Reconstructing the trajectory of the moving system allows these point clouds to be referenced in a global frame. Transversally, the solution uses robust methods for 3D reconstruction from point clouds and for filtering measurement noise in order to keep the coherent part of the information. The 3D mapping will be performed by off-line and progressive reconstruction. The proposed consortium is complementary.
Each partner will bring strong added value to the development of the global solution.
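The drift-compensation idea — dead reckoning from inertial data, periodically corrected by absolute vision-derived fixes — can be sketched in one dimension. This is a toy complementary-filter blend under assumed values (the bias, the blend gain, and the fix positions are all hypothetical); the project's actual system fuses full inertial and optical measurements in 3D.

```python
# 1D sketch of drift compensation: a biased per-tick displacement estimate
# (dead reckoning) is blended with occasional absolute "vision" fixes.

def fuse(inertial_steps, vision_fixes, gain=0.5):
    """inertial_steps: per-tick displacement estimates (biased -> drift).
    vision_fixes: dict mapping tick -> absolute position seen by the camera.
    gain: how strongly each fix pulls the estimate toward the observation."""
    pos = 0.0
    track = []
    for t, step in enumerate(inertial_steps):
        pos += step                       # dead reckoning accumulates bias
        if t in vision_fixes:             # occasional absolute correction
            pos += gain * (vision_fixes[t] - pos)
        track.append(pos)
    return track

# True motion: 1.0 per tick for 10 ticks; the sensor over-reads by 10%.
steps = [1.1] * 10
fixes = {4: 5.0, 9: 10.0}                 # assumed ground-truth observations
track = fuse(steps, fixes)
```

Without the fixes the final estimate would be 11.0 against a true position of 10.0; with two corrections the error is roughly halved at each fix, which is the qualitative behaviour the inertial-plus-vision coupling exploits.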
Project (from 2015)
Partners: University of Paris-Saclay, UEVE, IBISC
Funder: French National Research Agency (ANR)
Project Code: ANR-15-CE40-0015
Funder Contribution: 234,291 EUR

The traditional design and analysis of algorithms assumes that complete knowledge of the entire input is available to the algorithm. In many cases, however, the input is revealed online over time, and the algorithm must make each decision without knowledge of the future: scheduling jobs that arrive over time, managing a portfolio of stocks, making predictions based on expert advice, and so on. The main issue in online computation is thus obtaining good performance in the face of uncertainty, with inputs arriving sequentially, one at a time. Moreover, the emergence of massive-data problems creates a need for algorithms that solve problems while reading the input in a single pass, in the sense of online algorithms. Online computation is a well-established and active field; many interesting algorithms with performance guarantees and deep techniques have been designed. However, the current set of techniques does not provide effective means to study problems whose nature is non-linear or even non-convex, such as problems involving super-modular (energy minimization) or sub-modular function optimization. Together with the widely open status of many fundamental problems, the development of new principled approaches is crucial for the advance of the field. The other main issue, well identified in online computation, is the weakness of the worst-case paradigm. Summarizing an algorithm's performance by a pathological worst case can underestimate its performance on most inputs: many algorithms that perform well in practice admit only mediocre theoretical guarantees, whereas theoretically established ones can behave poorly in practice, even on simple instances.
Several models have been introduced to go beyond worst-case analysis. Each has successfully explained the performance of algorithms in certain contexts but has limits for other classes of problems. A common point of these models is that they employ the same analysis tools as the worst-case model, and the lack of appropriate tools is a primary obstacle to their development. In this project, our first objective is to establish principled methods for the design and analysis of online algorithms beyond current techniques; we aim to develop tools for studying convex and non-convex problems. The second objective is to investigate new models that measure algorithm performance accurately, beyond worst-case analysis. In particular, we are interested in founding a general model of resource augmentation that unifies previously apparently unrelated models. We propose novel approaches based on the duality paradigm of mathematical programming as transversal tools for both objectives. Beyond its theoretical interest, the progress made in the project could yield significantly improved algorithms in computational sustainability (e.g., energy-aware scheduling) and in massive data (e.g., sub-modular optimization).
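The notion of a performance guarantee in the face of uncertainty is easiest to see on a classic textbook example, not drawn from the project itself: the ski-rental problem. Renting costs 1 per day, buying costs B once, and the number of ski days is revealed online, one day at a time. The break-even rule "rent until day B, then buy" is 2-competitive: on every input its cost is at most (2 - 1/B) times the clairvoyant optimum.

```python
# Classic online-algorithms example: ski rental with the break-even rule.

def online_cost(days, buy_price):
    """Cost of the strategy 'rent while days < B, buy on day B'."""
    if days < buy_price:
        return days                       # rented every day
    return (buy_price - 1) + buy_price    # rented B-1 days, then bought

def offline_cost(days, buy_price):
    """Clairvoyant optimum: rent throughout, or buy on day one."""
    return min(days, buy_price)

B = 10
ratios = [online_cost(d, B) / offline_cost(d, B) for d in range(1, 100)]
worst_ratio = max(ratios)                 # equals 2 - 1/B
```

This competitive-ratio style of guarantee is exactly the worst-case paradigm the project's second objective seeks to refine: the bound 2 - 1/B is tight only on inputs that stop skiing right after the purchase, while on most inputs the algorithm is optimal.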
Project (from 2022)
Partners: Fondation Saint-Cyr / Recherche, INS2I, HEUDIASYC, IBISC, Université de Technologie de Compiègne, Centre de Recherche en Psychologie de la Connaissance, du Langage et de l'Émotion, UTC, UEVE, CNRS, Institut de recherche biomédicale des armées, Université de technologie de Compiègne - laboratoire Connaissance, Organisation et Systèmes Techniques, University of Paris-Saclay
Funder: French National Research Agency (ANR)
Project Code: ANR-21-CE39-0015
Funder Contribution: 700,146 EUR

The Covid-19 pandemic, by reducing physical exchanges, led to unprecedented peaks in digital usage and in data security breaches. The Internet of Things, big data, mobility and teleworking amplify cybersecurity risks. Healthcare and Defense share similar constraints and digital uses, including the need to ensure the reliability and confidentiality of data. Through a transversal approach to security (covering cognitive, social, technical, ethical and legal issues), from design to use, our objective is to demonstrate the value of considering the human as a full-fledged component of a complex system, one that must adapt to human vulnerabilities in order to increase the security and resilience of systems. We will propose recommendations for better integrating human factors from the design stage, as well as a targeted immersive educational program to help change user behaviors.
Project (from 2017)
Partners: IBISC, University of Paris-Saclay, UEVE
Funder: French National Research Agency (ANR)
Project Code: ANR-16-CE40-0021
Funder Contribution: 153,171 EUR

In recent years, data-aware systems have been proposed as a comprehensive framework for modelling complex business workflows by treating data and processes as equally relevant tenets of the system description [19, 28]. This setting is particularly suited to modelling auctions and auction-based mechanisms in electronic commerce [20]. The SVeDaS project is designed to advance the state of the art in the modelling, analysis and deployment of data-aware systems by using a novel, compositional, agent-based approach to their specification and verification. The main objectives of SVeDaS can be summarized as follows:
1. to introduce agent-based, computationally grounded models for data-aware systems, capable of expressing rich business workflows, including auction-based mechanisms and e-markets;
2. to explore logic-based formal languages for specifying the strategic behaviours of autonomous agents (including robustness against malicious behaviours, as well as manipulability and collusion in auctions) pertaining to business processes and the agents operating on them;
3. to analyse the formal properties of data-aware models, particularly issues concerning formal verification by model checking under imperfect information;
4. to find classes of data-aware systems and expressive logical fragments with a decidable model checking problem, possibly also amenable to practical verification;
5. to develop model checking tools and techniques for the verification and validation of data-aware systems in multi-agent scenarios, with a focus on auction mechanisms.
We anticipate that the results of the SVeDaS project will contribute significantly to our understanding of data-aware systems, improving the design and management of business processes through formal verification by model checking. In turn, these contributions will help build more secure and reliable systems and reduce the costs of faults in auction-based mechanisms for e-commerce and e-business.
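At its core, model checking explores a system's state space and asks whether any reachable state violates the specification. The following is a toy explicit-state sketch of that idea on an invented two-variable "auction" model; the states, transitions and property are all hypothetical, and the project targets far richer data-aware, multi-agent models with strategic and epistemic specifications.

```python
# Toy explicit-state model checker: BFS reachability of a "bad" state.
from collections import deque

def reachable_violation(initial, transitions, bad):
    """Return True iff some state reachable from `initial` satisfies `bad`."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if bad(state):
            return True
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Hypothetical auction model: a state is (highest_bid, closed).
def transitions(state):
    bid, closed = state
    if closed or bid >= 3:
        return []
    return [(bid + 1, False), (bid, True)]   # raise the bid, or close

# Candidate safety property: an auction never closes with a zero winning bid.
violates = reachable_violation((0, False), transitions,
                               lambda s: s[1] and s[0] == 0)
```

Here the checker reports a violation (the auction may close before any bid is placed), illustrating how model checking surfaces design faults; data-aware verification must additionally cope with unbounded data domains, which is where decidable fragments (objective 4 above) come in.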
2 Organizations, page 1 of 1
Organization, France. Website URL: http://www.universite-paris-saclay.fr/fr
Organization, France. Website URL: http://www.univ-evry.fr/en/index.html