
Ultraleap
7 Projects, page 1 of 2
Project 2021-2025
Partners: myenergi Ltd., National Institute of Industrial Engineering (NITIE Mumbai), Construction Scotland Innovation Centre, Airbus Operations Limited, SP Technology Automation and Robotics, Norscot Joinery Limited, The Shadow Robot Company, RAR UK Automation Ltd., Fanuc Robotics (UK) Ltd, Ultraleap, MAKAR Ltd, Fraunhofer Heinrich Hertz Institute (HHI), KUKA Robotics UK Limited, Inovo Robotics, Expert Tooling and Automation Limited, Sunamp Limited, Chinese Academy of Sciences, CNC Robotics Ltd, Rolls-Royce plc, Electroimpact UK Limited, HSSMI Ltd, Royal Institute of Technology (KTH) Sweden, Soil Machine Dynamics UK, Liberty Produce, BAE Systems (United Kingdom), Fraunhofer IPA, Be-St, HAL Robotics Ltd (UK), Spirit AeroSystems (UK), Claromech Limited, Stewart Milne Group, Scottish Agricultural Organisation Society (SAOS), University of Birmingham, Cambrian Intelligence, Ocado Group, GKN Aerospace, Measurement Solutions Ltd., University of Patras, Teknek Limited, Arrival Ltd, Toyota Motor Manufacturing (UK) Ltd, The Manufacturing Technology Centre (MTC), True Position Robotics Ltd, Loughborough University, IntelliDigest, Agri-EPI Centre, iRob International Ltd., Georgia Institute of Technology, Constellium UK Ltd, Scorpion Vision Limited
Funder: UK Research and Innovation. Project Code: EP/V062158/1. Funder Contribution: 4,821,580 GBP

The UK has fallen significantly behind other countries in adopting robotics and automation within factories. Collaborative automation, which works directly with people, offers excellent opportunities for strengthening UK manufacturing and rebuilding the UK economy. It will enable companies to increase productivity, to be more responsive and resilient when facing external pressures (like the Covid-19 pandemic), to protect jobs, and to grow. To enable confident investment in automation, we need to overcome current fundamental barriers. Automation needs to be easier to set up and use, more capable of dealing with complex tasks, more flexible in what it can do, and developed to collaborate safely and intuitively in a way that is welcomed by existing workers and wider society. To overcome these barriers, the ISCF Research Centre in Smart, Collaborative Robotics (CESCIR) has worked with industry to identify four priority areas for research: Collaboration, Autonomy, Simplicity, and Acceptance. The initial programme will tackle current fundamental challenges in each of these areas and develop testbeds for demonstration of results. Over the course of the programme, CESCIR will also conduct responsive research, rapidly testing new ideas to solve real-world manufacturing automation challenges.
CESCIR will create a network of academia and industry, connecting stakeholders, identifying challenges/opportunities, reviewing progress and sharing results. Open access models and data will enable wider academia to further explore the latest scientific advances. Within the manufacturing industry, large enterprises will benefit as automation can be brought into traditionally manual production processes. Similarly, better accessibility and agility will allow more Small and Medium sized Enterprises (SMEs) to benefit from automation, improving their competitiveness within the global market.
Project 2023-2025
Partners: Advanced Manufacturing Research Centre (AMRC), Autodesk Ltd, Ultraleap, University of Bristol, Bristol Digital Futures Institute, The Product Partnership, National Metals Technology Centre
Funder: UK Research and Innovation. Project Code: EP/W024152/1. Funder Contribution: 344,002 GBP

To design the future of products we need the future of prototyping tools. Across consumer product markets worth over £30Bn, priorities such as the demand for a non-technical user voice vie against increasingly advanced products and tough time and cost targets. These pressures are acutely felt in the prototyping process, where models often number in the hundreds for a single product and are inflexible, technically advanced, and resource-intensive to create. To succeed and evolve, prototyping needs to do more, more quickly, more cheaply, and with greater accessibility. This project aims to enhance learning, accessibility, and efficiency during prototyping. It will explore the feasibility and value of seamlessly integrating physical and digital prototyping into a single workflow. Recent and rapidly emerging technologies such as mixed reality, haptic interfaces, and gesture control have revolutionised the way we interact with the digital world. It is predicted that this technology will be ubiquitous by 2025, will be disruptive for the next decade, and will drive the way we work and interact across the future digital workplace, with engineering a top-five sector in which to realise value. In prototyping, these technologies will break down the physical-digital divide and create seamless experiences in which the strengths of each domain are realised simultaneously.
This new physical-digital integrated workflow brings profound opportunities for both engineers and users, supporting technical activities and simplifying communication. Among many possibilities, users may physically create and feel digital changes to prototypes in real time, dynamically overlay advanced analyses onto physical models, and support early-stage decision-making with physical-digital, tactile, interactive prototypes. These capabilities will allow more learning per prototype, widen accessibility to technical design, and streamline the prototyping process. However, we do not yet know how this vision may be fulfilled, exactly what benefits, value, or costs there may be, the feasibility of implementation, or which workflow approaches are effective. The project will explore the physical-digital workflow by creating and investigating several demonstrator platforms that combine and apply haptic, mixed reality, and gesture control technologies in targeted prototyping scenarios. Technologies will be explored in isolated sprints to understand their capabilities, before prioritisation and development into focused demonstrator tools that allow us to explore the integrated workflow across real prototyping cases, spanning activities, types, and stakeholders. Demonstrators will be evaluated and verified with end-users, industry partners, and the public to establish learning, speed, cost, and usage characteristics. Project outcomes will comprise workflows for integrated prototyping, together with knowledge of their value, effectiveness, feasibility, and future opportunities. A 'toolkit' of implementations will also provide exemplars for industrial partners and academia and lead the effective use of the integrated physical-digital workflow in engineering. All software and hardware will be open-sourced via GitHub and the project webpage, letting global researchers and the public create their own systems and build upon the work.
Future work will extend capabilities in line with the outcomes of the project, leading to the next generation of engineering design and prototyping tools. Industrial partners The Product Partnership (Amalgam, Realise Design, and Cubik) and the AMRC will bring prototyping, engineering, and end-user expertise, and will benefit from the workflows and technologies that are developed. OEMs Ultraleap and Autodesk will bring immersive-technology expertise and access to cutting-edge design systems, and will benefit from case-study implementations and future application opportunities. The Bristol Digital Futures Institute will facilitate collaboration across 20+ partner businesses and the public, with outputs supporting its mission for digital solutions that tackle global problems.
Project 2021-2026
Partners: Ultraleap, UCL
Funder: UK Research and Innovation. Project Code: EP/V037846/1. Funder Contribution: 916,580 GBP

The UK is a world leader in creating interactive applications enabled by the computational manipulation of acoustic wave fronts. Three examples of such applications are mid-air haptics, directional audio, and volumetric 3D particle displays. Using a phased array of ultrasonic speakers that are precisely and individually controlled, we can create high-pressure focal points in mid-air. By modulating these focal points in various ways, it is possible to 1) create rich tactile sensations when they are touched by the bare hands, 2) steer directional sounds that propagate over long distances with little attenuation, and 3) levitate small particles that, when rapidly moved in space, emulate volumetric 3D shapes thanks to a phenomenon called persistence of vision. The exploitation of these new technologies is uniquely available to Ultraleap (a UK-based company) and Sussex University, who have a long and productive history of collaboration. For example, Ultraleap is currently combining hand-tracking and ultrasonic mid-air haptic feedback solutions for applications ranging from VR training simulators and automotive interfaces to gaming machines and next-generation digital signage kiosks. Similarly, Sussex University is creating multimodal 3D displays based on rapidly updating ultrasonic phased arrays, which create persistence of vision by moving acoustically levitated objects. A significant constraint on the wide-scale deployment of phased-array technology is the cost and complexity of a non-modular system, which limits applicability: there is no one-size-fits-all phased array, and most integrated solutions need to be custom developed.
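The focusing principle described above — driving each speaker with a phase delay so that every wavefront arrives in step at a chosen point — can be sketched in a few lines. The following is a minimal free-field illustration under assumed geometry (a 16x16 array with 10 mm pitch, unit source amplitudes, simple 1/r spreading), not Ultraleap's actual control pipeline:

```python
# Hedged sketch: per-transducer phase delays that focus a 40 kHz ultrasonic
# phased array at a point in mid-air. Geometry and amplitudes are assumptions.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air
FREQ = 40_000.0          # typical ultrasonic transducer frequency, Hz
WAVELENGTH = SPEED_OF_SOUND / FREQ

def focus_phases(transducer_xyz, focal_point):
    """Phase per transducer so that all waves arrive in phase at focal_point."""
    distances = np.linalg.norm(transducer_xyz - focal_point, axis=1)
    # Compensating each element's travel path aligns the wave crests.
    return (2 * np.pi * distances / WAVELENGTH) % (2 * np.pi)

def pressure_magnitude(transducer_xyz, phases, point):
    """Relative acoustic pressure at a point (unit amplitudes, 1/r spreading)."""
    d = np.linalg.norm(transducer_xyz - point, axis=1)
    field = np.sum(np.exp(1j * (2 * np.pi * d / WAVELENGTH - phases)) / d)
    return abs(field)

# A 16x16 array in the z=0 plane, 10 mm pitch, focused 20 cm above its centre.
xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
array_xyz = np.column_stack([xs.ravel() - 0.075, ys.ravel() - 0.075,
                             np.zeros(256)])
focus = np.array([0.0, 0.0, 0.2])
phases = focus_phases(array_xyz, focus)

# Pressure at the focus far exceeds pressure at a point 3 cm to the side.
at_focus = pressure_magnitude(array_xyz, phases, focus)
off_focus = pressure_magnitude(array_xyz, phases, np.array([0.03, 0.0, 0.2]))
print(at_focus / off_focus > 5)  # True: waves add constructively only near the focus
```

The modulation step mentioned above then operates on this focus: slowly varying the focal amplitude (on the order of 100-200 Hz) produces a vibration the skin can feel, while rapidly repositioning the focus traces shapes in mid-air.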
In this project, we will circumvent such problems altogether by creating simple, low-cost, modular spatial sound modulator (SSM) units, i.e., smaller arrays of acoustic sources placed around the interactive space that can collectively out-perform the single large monolithic solution we currently have. Moreover, we will take a leap forward in sound-field control by removing scalability and reusability issues, opening up the exploitation of phased-array technologies to other application domains that can benefit from the non-contact delivery of haptic feedback, steerable directional sound, and/or volumetric 3D particle displays. Specifically, we will draw on the well-developed literature on multi-agent game theory and distributed computing to build a decentralised swarm architecture that can flexibly accommodate numerous SSM units. Each SSM will emit and modulate the sound-field near it while sharing a common awareness of the contextual details with its swarm host; the desired collective behaviour will emerge from the interactions between multiple SSMs and between the SSMs and their environment. There are several anticipated benefits to our proposed approach. Firstly, by designing simple, independent SSMs, we are able to address multiple commercial applications with the same primitive unit while simultaneously streamlining the manufacturing pipeline. These modular units can be used individually or combined in myriad ways to create new applications. Secondly, by enabling a distributed control architecture, swarm SSMs can scale up seamlessly and progressively. By incrementally combining larger numbers of modular units, customers can start with a small number of SSM units and dynamically grow the capabilities of their interactive multimodal system, adding new devices according to the application's needs.
Finally, by using game theory we will enable SSMs to cooperate dynamically with each other to meet application objectives independently of the application logic and the arrangement of our modular devices, simplifying the development and design process and enabling creative designers to focus on delivering ever more immersive and multimodal experiences.
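One reason a modular, decentralised design is physically plausible is that focusing is a per-transducer computation: given a shared target, each unit can compute its own phases without knowing anything about the other units, and their fields add coherently at the focus. A hedged toy sketch of that idea (our illustration with invented class names and assumed geometry, not the project's actual swarm architecture):

```python
# Hypothetical sketch: each SSMUnit holds only its own transducer geometry and
# computes its own focusing phases from a target broadcast by the swarm host.
# A collective high-pressure focus emerges without a central phase solver.
import numpy as np

C, F = 343.0, 40_000.0
K = 2 * np.pi * F / C  # acoustic wavenumber at 40 kHz

class SSMUnit:
    def __init__(self, origin, n=8, pitch=0.01):
        xs, ys = np.meshgrid(np.arange(n) * pitch, np.arange(n) * pitch)
        self.xyz = np.column_stack([xs.ravel(), ys.ravel(),
                                    np.zeros(n * n)]) + origin
        self.phases = np.zeros(n * n)

    def retarget(self, focal_point):
        # Purely local computation: no knowledge of other units required.
        d = np.linalg.norm(self.xyz - focal_point, axis=1)
        self.phases = (K * d) % (2 * np.pi)

    def field_at(self, point):
        d = np.linalg.norm(self.xyz - point, axis=1)
        return np.sum(np.exp(1j * (K * d - self.phases)) / d)

# Two 8x8 units placed side by side; the host only broadcasts the target point.
units = [SSMUnit(np.array([-0.12, -0.04, 0.0])),
         SSMUnit(np.array([0.04, -0.04, 0.0]))]
target = np.array([0.0, 0.0, 0.15])
for u in units:
    u.retarget(target)

combined = abs(sum(u.field_at(target) for u in units))
single = abs(units[0].field_at(target))
print(combined > 1.9 * single)  # True: the units add nearly coherently at the focus
```

What this toy model omits — and where the project's research lies — is everything beyond static focusing: synchronising clocks across units, negotiating conflicting objectives between applications, and adapting to units joining or leaving the swarm.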
Project 2021-2025
Partners: Neurosketch, Oxfam GB, Pentland Brands, IDEO, Ultraleap, Fashion District, Presca Teamwear, University of Warwick, Manor Farms, UK-CPI, Circular Systems, IBM Hursley, Laudes Foundation, Reskinned Resources Ltd, Abertay University, Wandsworth Borough Council, London Cloth Company, THP, Swift Analytical Ltd, Fashion Revolution, UK Fashion & Textile Association, LMB Textile Recycling (Lawrence M Barry), HKRITA, H&M Foundation, Technical Fibre Products Ltd, Yoox Net-a-Porter Group, Fashion for Good BV, NYC Economic Development Corporation, SharpEnd, Novozymes A/S, Henry Royce Institute, Business Growth Hub, EPSRC Future Composites Manufacturing Hub, ReLondon, Royal College of Art, Wilson Biochemicals Ltd, SUEZ Recycling and Recovery UK Ltd, University of Portsmouth, Jesmond Engineering, The Royal Society of Arts (RSA), Arcade Ltd, RAFC, Kiosk N1C, REGEMAT 3D SL, ON ROAD, Vireol Bio Industries plc, Materials and Design Exchange, University of Innsbruck
Funder: UK Research and Innovation. Project Code: EP/V011766/1. Funder Contribution: 4,436,880 GBP

The current global fashion supply chain is characterised by a lack of transparency, forced labour, poor working conditions, unequal power relationships, and overproduction driven by fast fashion. It is also highly polluting: the total footprint of clothing in use in the UK, including global and territorial emissions, was 26.2 million tonnes of CO2 in 2016, up from 24 million tonnes in 2012 (equivalent to over a third of household transport emissions). The Textiles Circularity Centre (TCC) proposes materials security for the UK by circularising resource flows of textiles. This will stimulate innovation and economic growth in the UK textile manufacturing, SME apparel, and creative technology sectors, whilst reducing reliance on imported and environmentally and ethically impactful materials and diversifying supply chains. The TCC will provide the underpinning research needed to enable the transition to a more circular economy that supports the brand 'designed and made in the UK'. To enact this vision, we will catalyse growth in the fashion and textiles sector by supporting the SME fashion-apparel community with innovations in materials and product manufacturing, access to circular materials through supply chain design, and consumer experiences. Central to our approach is enabling consumers to be agents of change by engaging them in new cultures of consumption. We will effect a symbiosis between novel materials manufacturing and agentive consumer experiences through a supply chain design comprising innovative business models and digital tools. Using lab-proven biotechnology, we will transform bio-based, waste-derived feedstocks (post-consumer textiles, crop residues, municipal solid waste) into renewable polymers, fibres, and flexible textile materials, as part of a circular economy (CE) transition strategy to replace imported cotton, wood pulp, synthetic polyester fibres, and petrochemical finishes.
We will innovate advanced manufacturing techniques that link the biorefining of organic waste, 3D weaving, robotics, and additive manufacturing to circular design, producing flexible continuous textiles and three-dimensional textile forms for apparel products. These techniques will enable manufacturing hubs to be located on the high street or in local communities, and will support SME apparel brands and retailers in offering on-site, on-demand manufacture of products for local customisation. These hubs would generate regional cultural and social benefits through business and related skills development. We will design a transparent supply chain for these textiles through industrial symbiosis between waste management, farming, biorefinery, textile production, SME apparel brands, and consumer stakeholders. Apparel brands will access this supply chain through our digital 'Biomaterials Platform', through which they can access the materials and data on their provenance, properties, circularity, and life-cycle extension strategies. Working with SME apparel brands, we will develop an in-store Configurator and novel affective and creative technologies to engage consumers in digitally immersive experiences and services that amplify the couplings between resource flow, human well-being, and satisfaction, thus creating a new culture of consumption. This dematerialisation approach will necessitate innovation in business models that add value to the apparel, in order to counter overproduction and detachment. Consumers will become key nodes in the circular value chain, enabling responsible and personalised engagement. As a human-centred, design-led centre, the TCC is uniquely placed to generate innovations that will catalyse significant business and skills growth in the UK textile manufacturing, SME fashion-apparel, and creative technology sectors, and drastically reduce waste, carbon emissions, and the environmental and ethical impacts of the textiles sector.
Project 2022-2026
Partners: Ultraleap
Funder: UK Research and Innovation. Project Code: MR/W013576/1. Funder Contribution: 772,637 GBP

When we touch a physical object, a sequence of mechanical events occurs whereby vibration is transmitted via the hard and soft tissues of the hand. The signals generated during object manipulation are transduced into neural signals via ascending sensory pathways that our brain interprets as touch. Combined with signals from our other senses, memories, and expectations, this information forms our realisation of the physical and psychological worlds. With modern technology it is possible to generate immersive environments with breath-taking graphics, yet touch technologies (also known as haptics) capable of realistically and unobtrusively emulating the sense of touch have only just begun to emerge. This Future Leaders Fellowship (FLF) aims to unlock new potential in non-contact touch technologies by holistically understanding both the physical and psychophysical dimensions of ultrasound mid-air haptics. To that end, we will lead ground-breaking R&D across acoustics, biophysics, neuroscience, and artificial intelligence (AI). Mid-air haptics refers to electronically controlled collections of ultrasound speakers (phased arrays) that collectively generate complex acoustic fields in 3D space which can be touched and felt with our bare hands. Holographic 3D objects and surfaces can therefore be "haptified" and interacted with in mid-air, without the need to wear or hold any specialised controllers; a feature particularly appreciated in public display interfaces, where it limits the spread of pathogens. Coupled with augmented and virtual reality solutions, the technology enables the design and remote-collaboration scenarios often seen in sci-fi movies such as Iron Man and Minority Report.
R&D in mid-air haptics has been accelerating in recent years, yet it has almost exclusively focused on hardware advancements, acoustic signal processing, and human-computer interaction (HCI) use cases. We believe that the true potential of ultrasound mid-air haptics is still unexplored, an opportunity uniquely available to be exploited by this FLF. Current mid-air haptic displays, such as those commercialised by Ultraleap, target only one type of touch receptor (mechanoreceptors), which limits device expressivity. Biophysical models capturing how acoustic waves interact with the skin are in their infancy and experimentally unverified. Generative and computational models connecting phased-array output, acoustic focusing waves, skin vibrations, mechanoreceptors, and psychophysical experiences are absent. This fellowship will be the first to thread these together. We will study ultrasonic mid-air haptics from first principles (i.e., acoustics and biophysics) all the way to perception and neurocognition. We will understand how localised acoustic energy generates non-localised skin vibrations, how those vibrations activate different touch receptors in the skin, and how receptors encode information that our somatosensory system then understands as touch. Once the forward problem is pieced together, our aim is to use machine learning to construct generative AI models that enable us to solve the inverse problem. What input ultrasound signals should be used to create the tactile sensation of holding a high-quality piece of paper? Today there is no scientific way of answering such a question, even though we know that something like this is possible. Bridging the different scientific fields related to ultrasonic mid-air haptics to create a holistic understanding of holographic touch is uniquely enabled by this FLF application.
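For intuition about what "solving the inverse problem" means at the acoustic layer, there is a classical, non-learning baseline: iterative phase retrieval in the spirit of Gerchberg-Saxton, widely used in acoustic holography to find emission phases that deliver pressure to several control points at once. A hedged NumPy sketch under assumed geometry (our illustration of the general technique, not the fellowship's AI-based method, which targets perceptual rather than purely acoustic objectives):

```python
# Hypothetical sketch: phase-only holography for two control points via
# Gerchberg-Saxton-style alternating projections. Geometry is an assumption.
import numpy as np

C, F = 343.0, 40_000.0
K = 2 * np.pi * F / C  # acoustic wavenumber

rng = np.random.default_rng(0)
# 16x16 array at z=0; two control points where we want strong pressure.
xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
tx = np.column_stack([xs.ravel() - 0.075, ys.ravel() - 0.075, np.zeros(256)])
targets = np.array([[-0.02, 0.0, 0.15], [0.02, 0.0, 0.15]])

# Propagation matrix: complex contribution of transducer j at control point m.
d = np.linalg.norm(targets[:, None, :] - tx[None, :, :], axis=2)  # (2, 256)
A = np.exp(1j * K * d) / d

phases = rng.uniform(0, 2 * np.pi, 256)
for _ in range(50):
    p = A @ np.exp(1j * phases)        # forward: field at the control points
    t = p / np.abs(p)                  # keep phases, demand equal amplitudes
    phases = np.angle(A.conj().T @ t)  # back-project onto unit-amplitude emitters

p_final = A @ np.exp(1j * phases)
amps = np.abs(p_final)
print(amps.min() > 300)  # both control points receive strong pressure
```

The fellowship's framing goes further: instead of specifying target pressures, one would specify a target *sensation* ("high-quality paper") and let a learned generative model, trained on the forward chain from acoustics through skin mechanics to perception, propose the drive signals.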
This 4-year, full-time, reduced hours FLF will support a cross-disciplinary and agile team of 2 postdoctoral research associates (RAs) led by the fellow, while being hosted at the only company in the world that is commercialising mid-air haptics, thus providing the fellowship with access to unique resources, engineering insights, and a direct pathway to economic and societal impact.