
LINA
15 Projects, page 1 of 3

Project (from 2015)
Partners: INS2I; CNRS PARIS A; LINA; École Polytechnique; Université Pierre et Marie Curie; Laboratoire d'Informatique, Signaux et Systèmes de Sophia Antipolis; University of Nantes; LIX; INRIA; CNRS; Laboratoire d'Informatique de l'École Polytechnique
Funder: French National Research Agency (ANR)
Project Code: ANR-15-CE25-0002
Funder Contribution: 874,079 EUR

Verifying the correctness and robustness of programs and systems is a major challenge in a society that relies more and more on safety-critical systems controlled by embedded software. This issue is even more critical when the computations involve floating-point arithmetic, an arithmetic known for its quite unusual behaviours and increasingly used in embedded software. Consider, for example, the "catastrophic cancellation" phenomenon, where most of the significant digits of a result are cancelled, or numerical sequences whose limit over the real numbers is very different from their limit over the floating-point numbers. A more important problem arises when we want to analyse the relationship between floating-point computations and an "idealized" computation that would be carried out with real numbers, the reference in the design of the program. The point is that, for some input values, the control flow over the real numbers can go through one conditional branch while it goes through another one over the floating-point numbers. Certifying that a program, despite some control-flow divergences, computes what it is actually expected to compute with a minimal error is the subject of robustness or continuity analysis.

Providing a set of techniques and tools for verifying the accuracy, correctness and robustness of critical embedded software is a major challenge. The aim of this project is to address it by exploring new methods based on a tight collaboration between abstract interpretation (AI) and constraint programming (CP). In other words, the goal is to push the limits of these two techniques in order to improve accuracy analysis, to enable a more complete verification of programs using floating-point computations, and thus to make critical decisions more robust. The cornerstone of this project is the combination of the two approaches to increase the accuracy of robustness proofs by using CP techniques and, where appropriate, to generate non-robust test cases. The goal is to benefit from the strengths of both techniques: CP provides powerful but computationally expensive algorithms to reduce domains to an arbitrary given precision, whereas AI does not provide fine control over domain precision but has developed many abstract domains that quickly capture program invariants of various forms. Incorporating some CP mechanisms (search tree, heuristics) into abstract domains would make it possible, in the presence of false alarms, to refine the abstract domain with better precision. The first problem to solve is to set the theoretical foundations of an analyser based on two substantially different paradigms. Once the interactions between CP and AI are well formalized, the next issue is to handle constraints of general forms and potentially non-linear abstract domains. Last but not least, an important issue concerns the robustness analysis of systems more general than programs, such as the hybrid systems that model control-command programs.
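
As a minimal, hypothetical illustration (not part of the project itself), the following Python sketch shows the two floating-point phenomena mentioned above: catastrophic cancellation, and a conditional whose branch over the floats can differ from the branch that would be taken over the reals.

```python
from fractions import Fraction  # exact rationals, standing in for the reals

# Catastrophic cancellation: subtracting nearly equal values wipes out the
# significant digits. Over the reals, (1 + eps) - 1 == eps for any eps.
eps = 1e-17
print((1.0 + eps) - 1.0)                       # 0.0 over IEEE-754 doubles
print(float(Fraction(1) + Fraction(eps) - 1))  # ~1e-17: nonzero over the rationals

# Control-flow divergence: over the reals, x * (1/x) == 1 for any x != 0,
# so branch() always takes the 'then' branch; over the floats, rounding can
# send some inputs down the 'else' branch instead.
def branch(x: float) -> str:
    return "then" if x * (1.0 / x) == 1.0 else "else"

divergent = [x for x in range(1, 100) if branch(float(x)) == "else"]
print(divergent)  # typically non-empty on IEEE-754 doubles (49 is a classic case)
```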

Research results will be evaluated on realistic benchmarks coming from industrial companies, in order to determine their benefits and relevance. For the explored approaches, using realistic examples is a key point, since the proposed techniques often behave acceptably only on given subclasses of problems (if we consider worst-case computational complexity, all these problems are intractable). That is why many solutions are closely connected to the target problems.

Project (from 2012)
Partners: UL; University of Nantes; DGDS; Institut National de Recherche en Informatique et en Automatique; INIST; Institut de l'Information Scientifique et Technique; LINA; CNRS; ATILF; Laboratoire de Linguistique et Didactique des Langues étrangères et maternelles (EA 609); INSHS; Laboratoire d'Informatique de Nantes Atlantique (UMR 6241); Centre de Recherche Inria Nancy - Grand Est
Funder: French National Research Agency (ANR)
Project Code: ANR-12-CORD-0029
Funder Contribution: 710,718 EUR

The collaborative research project TermITH (Terminology and Indexation of Texts in the area of Humanities) brings together six French partners: ATILF (Analysis and Natural Language Processing of the French Language), INIST (National Institute of Scientific and Technical Information), LINA (Laboratory of Computer Science of Nantes), LIDILEM (Laboratory of Linguistics and Applied Linguistics of Native and Second Languages, Grenoble) and two INRIA centres (National Institute for Research in Computer Science and Automation), INRIA Nancy Grand-Est and INRIA Saclay. This project deals with access to textual documents via full-text indexing based on terms that are detected, disambiguated and analysed. The issue is well known: the digital age is characterized by a very large quantity of information that has to be indexed to make it accessible, and by a growing diversity of areas and disciplines that entails increasingly frequent interdisciplinarity.

Text indexing based on term occurrences is still an active research topic, although different approaches have recently provided good results. These approaches use either occurrences of terms detected on the basis of their textual form (projection of controlled vocabularies or structured terminologies using pattern matching, inflection rules and syntagmatic variations, as in FASTR, for instance), or term candidates produced by automatic term detection components. All these methodologies require expensive human verification: (1) for indexing, manual checking of the automatically defined indexes, or even complete analysis of the documents in order to define their proper indexes; (2) for automatic term detection, classification of the very large number of term candidates; (3) for the projection of controlled vocabularies or structured terminologies, updating of the terminological resources.

TermITH's approach is to cross the automatically detected and disambiguated occurrences of terms in texts with available interdisciplinary lexicons and terminological resources, in order to isolate the terms specific to each studied area. Such an approach has two main advantages. First, it limits the human cost of manually evaluating document indexes and, where necessary, manually analysing documents; this results from the disambiguation and from the crossing with interdisciplinary lexicons and terminological resources. Second, it will permit automatic updating of terminological resources. From a theoretical point of view, TermITH will allow cross-fertilization between disciplines that currently evolve in parallel: contextual disambiguation, data mining and textual statistics for term disambiguation; automatic term detection, terminological resource projection and interdisciplinary lexicons for detecting terms and indexing them in texts.
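
As a rough, hypothetical sketch of the crossing step described above (the names, data and threshold are illustrative, not TermITH components), one could filter automatically detected term candidates against a cross-disciplinary lexicon and keep the frequent survivors as domain-specific index terms:

```python
from collections import Counter

def domain_terms(candidates, general_lexicon, min_freq=2):
    """Filter automatically detected term candidates against a
    general-language / interdisciplinary lexicon; frequent survivors
    are kept as domain-specific index terms."""
    counts = Counter(t.lower() for t in candidates)
    return {t: n for t, n in counts.items()
            if n >= min_freq and t not in general_lexicon}

# Toy data: a tiny general lexicon and the raw output of a term detector.
general_lexicon = {"analysis", "result", "method"}
candidates = ["phoneme", "Phoneme", "analysis", "morpheme",
              "phoneme", "morpheme", "result"]
print(domain_terms(candidates, general_lexicon))
# {'phoneme': 3, 'morpheme': 2}
```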

In the first experimental phase, the TermITH partners have chosen to work within a scientific area in which the ambiguity between terminological and general language usage is very high: the humanities. The proposed methodology will be tested on linguistics and then validated on four other disciplines: history, sociology, psychology (analytic and social psychology, and cognitive science) and archaeology. If the results are good for these five ambiguous disciplines, indexing documents from less ambiguous disciplines (such as biology, genetics or physics) will be all the easier with our methodology.

Project (from 2014)
Partners: LINA; Laboratoire d'Informatique Algorithmique: Fondements et Applications; Institut de Recherche en Communications et Cybernétique de Nantes; LIAFA; Department of Computer Science, Aalborg University; Laboratoire d'Informatique de Paris Nord; University of Nantes
Funder: French National Research Agency (ANR)
Project Code: ANR-14-CE28-0002
Funder Contribution: 453,896 EUR

Model checking and formal modelling are now techniques with a certain academic recognition, but their applicability in practice remains somewhat below expectations. This is in part due to two main problems: rather rigid modelling of systems, which impairs abstraction and scalability, and insufficient feedback from the verification process. In this project, we address these issues by lifting these techniques to the more flexible and rich setting of parametrised formal models. In that setting, some features of the system, such as the number of processes, the size of some buffers, communication delays, deadlines or energy consumption, may not be numerical constants but rather unknown parameters. The model-checking questions then become more interesting: is some property true for all values of the parameters? Does there exist some value for which it is? Or even, what are all the possible values for which it is?

Building on the skills of the consortium in topics like regular model checking, timed systems and probabilistic systems, and on our previous contributions to the model checking of systems with a parametrised number of processes and of parametrised timed systems, including the development of software tool prototypes, we develop in this project new models, techniques and tools to extend the applicability of parameters in formal methods. To achieve this objective, we study parameters in the context of discrete and timed/hybrid systems, both possibly augmented with quantitative information about costs (e.g. energy consumption) and probabilities. This gives the following six tasks:
1. Discrete parameters
2. Timing parameters
3. Discrete and timing parameters
4. Parameters in cost-based models
5. Parameters in discrete models with probabilities
6. Parameters in timed models with probabilities

Parametrised models are of obvious interest, but the associated theoretical problems are hard. For instance, in the model of parametric timed automata, the basic model for timed systems with time parameters, the mere existence of a parameter value such that the system can reach some given state is generally undecidable, even with only one parameter. As a consequence, in all these tasks we follow a common methodology, acknowledging these difficulties, that consists in formalising the problem, studying decidable subclasses and designing efficient algorithms for the parametrised model-checking problems (including, in particular, parameter synthesis), building efficient semi-algorithms for the general class that behave well in realistic cases, and finally implementing the techniques in tool prototypes.
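
To make the parameter-synthesis question concrete, here is a deliberately tiny, hypothetical Python sketch (not one of the project's tools, and far simpler than the timed and probabilistic models above): it enumerates values of a discrete parameter N and keeps those for which a safety property holds in every reachable state. Enumeration only works on finite ranges; as noted above, the general problem is undecidable.

```python
def reachable_states(n_procs):
    """Explicit-state exploration of a toy system: n_procs identical
    processes each increment a shared counter once. A global state is
    (counter, processes_done)."""
    frontier, seen = [(0, 0)], {(0, 0)}
    while frontier:
        counter, done = frontier.pop()
        if done < n_procs:                      # one more process moves
            nxt = (counter + 1, done + 1)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

def synthesize(bound, max_n=10):
    """Enumerate values of the parameter N and keep those for which the
    safety property 'counter <= bound' holds in every reachable state."""
    return [n for n in range(1, max_n + 1)
            if all(c <= bound for c, _ in reachable_states(n))]

print(synthesize(bound=5))  # [1, 2, 3, 4, 5]
```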

This raises many challenging and original problems, like extending regular model checking to graphs to model parametrised systems with an arbitrary topology, using infinite-state automata to represent sets of configurations, finding useful decidable classes of parametrised timed/hybrid systems or properties, providing techniques for approximate synthesis of parameter values, studying models with parametrised costs, studying probabilistic parametric models, and extending statistical verification techniques to parametric systems. We aim at producing high-quality scientific results, published in the recognized venues of the formal methods and model-checking communities, but also at producing software tool prototypes to make these results available in practice, both for the research community and for higher education. Finally, we want to promote the field of parametrised model checking through the organisation of a yearly open workshop, as a scope-extended version of the SynCoP workshop organised in 2014. Beyond the classical application fields of formal methods (e.g. embedded systems), we envision new application domains like smart homes, where parameters may account for the specifics of the residents. In that setting, cost-based parametrised models are particularly interesting for a focus on optimising energy consumption.

Project (from 2013)
Partners: LINA; University of Nantes; University of Rennes 1; CRAL; INSHS; CNRS; CENTRE ATLANTIQUE DE PHILOSOPHIE; EHESS
Funder: French National Research Agency (ANR)
Project Code: ANR-12-CULT-0003
Funder Contribution: 295,996 EUR

This project aims at studying the "digital turn", that is, the transition from a cultural realm defined by physically available media content to a world of digitized media, a shift that is likely to deeply reconfigure our ways of being in the world. In order to analyze this "digital revolution", we will focus on artistic contexts and, more specifically, on music in everyday life and its socio-technical reconfiguration. We have chosen this relatively narrow subject for our investigations for a series of reasons. First, music consumption is a widely shared social experience in which technological innovation, from acetate to MP3, is a key element of the transformation. Second, illegal downloading and peer-to-peer file sharing have become major issues of public debate, legislation and intervention. Finally, our everyday experience of music is not limited to listening: it engages our identities, our conception of time, our emotions and attachments, our ways of understanding and articulating public and private space, and so on; in short, it constitutes a very complete social experience.

Furthermore, since the early 2000s we have been witnessing a turning point in which the music business goes through an almost uninterrupted market recession while, simultaneously, digital music sales expand significantly. Indeed, the digitization of content (what Fabien Granjon and Clément Combes have named "digitamorphosis", succeeding Antoine Hennion's "discomorphosis") drives a shift in the way amateurs relate to musical content. The rules of listening and interpretation are not immutable. An entire century of music recordings and experiments has already transformed what we expect from music, its creative processes, and the soundscapes and formats that make up our everyday musical experience. However, these changes depend on discrete value alterations accumulating over time and, among other things, on a standardization of traditional codes and more recent practices. In short, as digital schemes develop, the limits of what is acceptable shift: listening, possessing, sharing or archiving are experiences that are evolving due to streaming technologies, the co-existence of multiple listening devices (personal computers, home stereos, portable music players) and the presence of musical content in social networks. Thus, the digitization of music subverts the dominant paradigm of media and medium as a merged whole (tape, acetate, CD), suggesting the possibility of a new paradigm: that of music as a service and not just as data. We could be going from a product-based society to a society of experience.

In order to carry out this project, in which cultural sociologists, ethnomusicologists and computer science specialists from three partner laboratories participate (the Atlantic Centre of Philosophy at the University of Nantes, the Nantes Computing Laboratory, and the Arts and Language Research Center at EHESS), we set three goals: first, to establish a chronological sequence of the "digital turn", while developing a reflective analysis of what it means to take a socio-historical approach to this type of transformation.

Second, we seek to understand how the shift from an analog culture to a digital one, as well as the appearance of a "digital-native" generation, may transform our everyday musical experience. Finally, we will consider the hypothesis of digital technology (and especially social networks) as a lever of transformation of the traditional paradigms that shape our present understanding of musical taste, legal frameworks for musical consumption, and political ideals of democracy through the Internet.

Project (from 2015)
Partners: LINA; UNIVERSITE NICE SOPHIA ANTIPOLIS Laboratoire Symbiose Marine; SBR; University of Nantes; UNIVERSITE NICE SOPHIA ANTIPOLIS Institut de Chimie de Nice
Funder: French National Research Agency (ANR)
Project Code: ANR-15-CE02-0011
Funder Contribution: 588,910 EUR

Oceanic environments are sensitive to climate change. The most emblematic example is the bleaching of coral reefs due to the breakdown, in response to temperature rise, of the symbiosis between the coral and its dinoflagellate microalgal symbionts. Such mutualistic photosymbioses are not only a highly significant process in evolution but also a key ecological interaction supporting the functioning of whole ecosystems. While corals are highly symbolic and attract most of the research attention, they represent only a small fraction of photosymbioses in the global ocean. In the plankton, arguably one of the least explored compartments of the biosphere, photosymbiotic relationships with dinoflagellates are frequently observed and hold a key position in pelagic ecosystems. Considering the obvious significance of plankton on the one hand and of photosymbiosis on the other, coral bleaching could be just the tip of the iceberg of a more global, unnoticed phenomenon occurring at the surface of all oceans. "Plankton bleaching", which fossil records indicate already occurred in the middle Eocene (40 million years ago), could have a considerable impact on oceanic ecosystem structure and function.

We hypothesize that the fundamental biological processes underlying benthic and planktonic photosymbioses are based on common molecular pathways and have been selected to respond to similar environmental settings. In this context, the IMPEKAB project will seek (i) to evaluate the sensitivity of planktonic photosymbiosis to environmental changes, and (ii) to unveil the fundamental biological processes involved in the response of photosymbiosis to thermal stress across eukaryotic lineages, by comparing the outcomes of similar experiments performed on marine benthic and planktonic host models. Finally, based on this comprehensive understanding of photosymbiosis stress processes, and considering that plankton bleaching could impact the whole pelagic ecosystem, we will (iii) apply an original eco-systems biology approach to evaluate the thermal stress response of planktonic photosymbiosis in the environment.

The research strategy we propose in order to attain these objectives is based on promising preliminary results obtained on the sea anemone and the Radiolaria, our ecologically relevant benthic and planktonic biological models, respectively. Taking advantage of the partners' strong expertise on their respective biological models, we have developed a carefully planned experimental strategy. Briefly, once the physiological characteristics of the models with respect to temperature variation have been defined, we will use this framework to conduct experiments to decipher the genetic and metabolic responses associated with thermal stress. The data acquired and the outcomes of comparative analyses between experimental conditions and biological models will be used to implement innovative modelling approaches that integrate all results into an environmental context.

Ultimately, we aim to achieve a detailed physio-genomic understanding of how marine photosymbiosis responds to environmental stressors, in order to develop tools that will facilitate in situ monitoring of the potentially critical "plankton bleaching" phenomenon. This joint initiative, ambitious and highly original in the national and international research landscape, will involve recognized scientists from the fields of biology, ecology, chemistry, bioinformatics and computational modelling, promoting strong exchanges of concepts and technical expertise and contributing to the development of the novel "eco-systems biology" field. As a consortium, we will implement a carefully organized strategy for the scientific dissemination, outreach and valorization of the results obtained in IMPEKAB.

1 Organization, page 1 of 1
Organization: France. Website URL: http://www.univ-nantes.fr/