Health & Social Care Information Centre

7 Projects
  • Funder: UK Research and Innovation; Project Code: EP/T022582/1
    Funder Contribution: 3,797,250 GBP

    The Centre for Digital Citizens (CDC) will address emerging challenges of digital citizenship, taking an inclusive, participatory approach to the design and evaluation of new technologies and services that support 'smart', 'data-rich' living in urban, rural and coastal communities. Core to the Centre's work will be the incubation of sustainable 'Digital Social Innovations' (DSI) that will ensure digital technologies support diverse end-user communities and will have long-lasting social value and impact beyond the life of the Centre. Our technological innovations will be co-created between academic, industrial, public and third sector partners, with citizens supporting co-creation and delivery of research. Through these activities, CDC will incubate user-led social innovation and sustainable impact for the Digital Economy (DE), at scale, in ways that have previously been difficult to achieve. The CDC will build on a substantial joint legacy and critical mass of DE-funded research between Newcastle and Northumbria universities, developing the trajectory of work demonstrated in our highly successful Social Inclusion for the Digital Economy (SIDE) hub, our Digital Civics Centre for Doctoral Training and our Digital Economy Research Centre (DERC). The CDC is a response to recent research that has challenged simplified notions of the smart urban environment and its inhabitants, and highlighted the risks of emerging algorithmic and automated futures. The Centre will leverage our pioneering participatory design and co-creative research, and our expertise in digital participatory platforms and data-driven technologies, to deliver new kinds of innovation for the DE that empower citizens. 
The CDC will focus on four critical Citizen Challenge areas arising from our prior work: 'The Well Citizen' addresses how use of shared personal data, and publicly available large-scale data, can inform citizens' self-awareness of personal health and wellbeing, of health inequalities, and of broader environmental and community wellbeing; 'The Safe Citizen' critically examines online and offline safety, including issues around algorithmic social justice and the role of new data technologies in supporting fair, secure and equitable societies; 'The Connected Citizen' explores next-generation citizen-led digital public services, which can support and sustain civic engagement and action in communities, and engagement in wider socio-political issues through new sustainable (openly managed) digital platforms; and 'The Ageless Citizen' investigates opportunities for technology-enhanced lifelong learning and opportunities for intergenerational engagement and technologies to support growth across an entire lifecourse. CDC pilot projects will be spread across the urban, rural and coastal geography of the North East of England, embedded in communities with diverse socio-economic profiles and needs. Driving our programme to address these challenges is our 'Engaged Citizen Commissioning Framework'. This framework will support citizens' active engagement in the co-creation of research and critical inquiry. The framework will use design-led 'initiation mechanisms' (e.g. participatory design workshops, hackathons, community events, citizen labs, open innovation and co-production platform experiments) to support the co-creation of research activities. Our 'Innovation Fellows' (postdoctoral researchers) will engage in a 24-month social innovation programme within the CDC. They will pilot DSI projects as part of highly interdisciplinary, multi-stakeholder teams, including academics and end-users (e.g. Community Groups, NGOs, Charities, Government, and Industry partners). 
The outcome of these pilots will be the development of further collaborative bids (Research Council / Innovate UK / Charity / Industry funded), venture capital pitches, spin-outs and/or social enterprises. In this way the Centre will act as a catalyst for future innovation-focused DE activity.

  • Funder: UK Research and Innovation; Project Code: ES/XX00018/1
    Funder Contribution: 445,000 GBP

    ADR UK (Administrative Data Research UK) is a partnership transforming the way researchers access the UK’s wealth of public sector data, to enable better informed policy decisions that improve people’s lives. By linking together data held by different parts of government, and by facilitating safe and secure access for accredited researchers to these newly joined-up data sets, ADR UK is creating a sustainable body of knowledge about how our society and economy function – tailored to give decision makers the answers they need to solve important policy questions. ADR UK is made up of three national partnerships (ADR Scotland, ADR Wales, and ADR NI) and the Office for National Statistics (ONS), which ensures data provided by UK government bodies is accessed by researchers in a safe and secure form with minimal risk to data holders or the public. The partnership is coordinated by a UK-wide Strategic Hub, which also promotes the benefits of administrative data research to the public and the wider research community, engages with UK government to secure access to data, and manages a dedicated research budget. ADR UK is funded by the Economic and Social Research Council (ESRC), part of UK Research and Innovation. To find out more, visit adruk.org or follow @ADR_UK on Twitter.

    ADR UK is funding the creation of a research-ready database linking health, education and social care data for all children in England for the first time. ECHILD stands for Education and Child Health Insights from Linked Data. The study involves the linking of around 14 million children’s records, which will be used to better understand how education affects children’s health and how health affects children’s education. 
The ECHILD project is led by University College London in collaboration with the London School of Hygiene & Tropical Medicine and the Institute for Fiscal Studies, in partnership with NHS Digital and the Department for Education, working with the Office for National Statistics (ONS).

  • Funder: UK Research and Innovation; Project Code: EP/S023283/1
    Funder Contribution: 7,843,810 GBP

    The UKRI CDT in Artificial Intelligence (AI) for Healthcare will be the world's leading centre for PhD training of the next generation of innovators in AI applied to healthcare. AI has a unique role in healthcare: providing more accurate decisions faster while reducing cost and suffering across society. AI in healthcare needs and drives current AI research avenues such as interpretable AI, privacy-preserving learning, trust in AI, data-efficient learning and safety in autonomy. These are key due to the immediate impact on life and health for users depending on AI for healthcare support. Healthcare applications require many AI specialists who can apply their skills in this heavily regulated domain. To address this need, we propose to train 90+ PhD students in total, including 16 clinical PhD Fellows, in five cohorts of 18+ PhDs, which will establish a new generation of cognitively diverse AI researchers with backgrounds ranging from computer science and psychology to design engineering and clinical medicine. The CDT focus areas arise from our early engagement in AI research and collaboration with clinicians, partnered technology companies and patient organisations, reflecting the healthcare areas of the UK industrial strategy. The Centre is grouped into 4 complementary healthcare themes and 4 cross-cutting AI expertise streams. The 4 healthcare themes are: (1) Productivity in Care: making healthcare provision more efficient and effective by increasing the productivity of doctors and nurses; (2) Diagnostics & Monitoring: developing AI-based diagnostics & monitoring that can detect disease earlier and monitor health with more precision; (3) Decision Support Systems: AI-based decision support systems that will, for example, free up doctors' time to focus on the patient, accelerate the development of novel drugs and treatments, and empower patients to be active agents within decision-making through explanation; and (4) Biomedical Discovery: AI-driven work that accelerates drug discovery and links genome, microbiome and environment data to discover novel disease mechanisms and treatment pathways. The themes are linked by 4 cross-cutting AI expertise streams: a. Perceptual AI technologies enable systems to perceive, structure, and recognise clinically relevant information from sensory data. b. Cognitive AI technologies mimic the reasoning, i.e. the cognitive processes, of healthcare specialists. c. Assistive AI technologies support clinicians with decision making as well as patients directly. d. Underpinning AI technologies drive clinical and patient-focused AI innovations and will enable AI methodologies to operate beyond what is currently possible. Our unique cohorts will benefit from an integrated training programme and co-creation process with industry and patient organisations. PhD training is split into three phases that provide underpinning skills training (Foundation phase), research training (Research phase) and finally drive PhD impact (Impact phase). During the Impact phase, the students will either (1) commercialise their research through a mentored start-up route (incubator partners), (2) deploy their technology in a clinical trial (two NIHR biomedical research centre (BRC) partners), or (3) test their work in person through an NHS honorary contract (three NHS trusts as partners). Bespoke training will be created in topics such as AI bias and ethics, security, trust, inclusivity, differential privacy, transparency, accessibility and usability, service design, global inclusivity, healthcare treatments, clinical statistics and data regulation, healthcare technology regulation, and technology commercialisation. 
We offer an exit strategy (months 9-12) through a master's degree. The centre will place special emphasis on research that explores diversity in AI for healthcare, including services to underserved communities and minority-specific care requirements.

  • Funder: UK Research and Innovation; Project Code: EP/S021612/1
    Funder Contribution: 6,684,740 GBP

    PhD projects will be organised in three central themes that represent the core of our programme. The themes are aligned to the strategic priorities of our NHS partners and the overall vision of the CDT: A. AI-enabled diagnostics or prognostics [lead: McKendry]. Deep learning - the subset of machine learning that is based on a network structure loosely inspired by the human brain - enables networks to learn features from clinical data automatically. This gives them the ability to model complex non-linear relationships, and such AI methods have found application in clinical diagnosis using either parameters typically embedded in an electronic health record (like blood test results) or the images produced during radiographic exams or in digital pathology suites. This theme will help us create, initiate and deploy academic research projects centred on clinical use cases of direct applicability in the hospitals where our Centre is based. Example projects might include the detection of radiological abnormalities; characterisation of tissues and tissue abnormality (e.g. cancer staging); or the serial monitoring of disease. B. AI-enabled operations [lead: Marshall]. The proximity of our Centre to the end-users of health technology prompts a second focus, on the use of AI methods to optimise care processes and pathways. We will ensure that our projects are academically focused, but will seek to create new approaches to investigate and characterise the performance of hospital systems and processes - such as the flow of patients through emergency departments, or AI-enabled projects that might shorten time-to-treatment or cancer waiting times. This will be the most translationally focused theme, seeking to surface and address key use cases of the greatest academic interest. C. AI-enabled therapeutics [lead: Denaxas]. Our final theme is forward-looking: the use of deep learning and other AI methods in therapeutic inference or even in a therapy itself. 
AI methods may be most applicable here in mental health, where deployment of 'talking therapies' is as efficacious through the internet or telephony as face-to-face; or in the development of 'avatar therapies' such as that recently proposed at UCL for hallucinations. But a wide variety of research projects are conceivable, including rehabilitation following stroke, or indeed the use of AI monitoring of radiological change as a proxy endpoint for drug trials. This theme will help us focus cutting-edge work in our Centre around such use cases and novel methodology. The UK leads in the development of artificial intelligence technologies, investing around $850M between 2012 and 2016, the third highest of any country. This has catalysed significant UK involvement of major global technology companies such as Alphabet and Apple, the creation of new UK-based AI companies such as Benevolent AI and DeepMind (both partners in our Centre) and the emergence of a vibrant UK SME community. 80% of AI companies on the UK Top 50 list are based in London, most within 30 minutes' travel of UCL. Many of the most successful AI companies now focus on the application of AI in health, but the successful application of AI technologies such as deep learning has three key unmet needs: the identification of clinically relevant use cases, the availability of large quantities of high-quality labelled data from NHS patients, and the availability of scientists and software engineers with the requisite algorithmic and programming skills. All three are addressed by our CDT through its novel NHS-embedded approach to training, linked to primary and social care, with close involvement of commercial partners, structured internships, and leadership and entrepreneurship training. 
This will not only create an entirely new cadre of individuals with both clinical knowledge and algorithmic/programming expertise, but also catalyse the creation and discovery of new large labelled datasets and exceptional clinical use cases informed by real-world clinical care.

  • Funder: UK Research and Innovation; Project Code: EP/W011239/1
    Funder Contribution: 703,615 GBP

    Autonomous systems, such as medical systems, autonomous aerial and road vehicles, and manufacturing and agricultural robots, promise to extend and expand human capacities. But their benefits will only be harnessed if people have trust in the human processes around their design, development, and deployment. Enabling designers, engineers, developers, regulators, operators, and users to trace and allocate responsibility for the decisions, actions, failures, and outcomes of autonomous systems will be essential to this ecosystem of trust. If a self-driving car takes an action that affects you, you will want to know who is responsible for it and what the channels for redress are. If you are a doctor using an autonomous system in a clinical setting, you will want to understand the distribution of accountability between you, the healthcare organisation, and the developers of the system. Designers and engineers need clarity about what responsibilities fall on them, and when these transfer to other agents in the decision-making network. Manufacturers need to understand what they would be legally liable for. Mechanisms to achieve this transparency will not only provide all stakeholders with reassurance; they will also increase clarity, confidence, and competence amongst decision-makers. The research project is an interdisciplinary programme of work - drawing on the disciplines of engineering, law, and philosophy - that culminates in a methodology to achieve precisely that tracing and allocation of responsibility. By 'tracing responsibility' we mean the process of tracking the autonomous system's decisions or outcomes back to the decisions of designers, engineers, or operators, and understanding what led to the outcome. 
By 'allocating responsibility' we mean both allocating role responsibilities to different agents across the life-cycle and working out in advance who would be legally liable and morally responsible for different system decisions and outcomes once they have occurred. This methodology will facilitate responsibility-by-design and responsibility-through-lifecycle. In practice, the tracing and allocation of responsibility for the decisions and outcomes of AS is very complex. The complexity of the systems and the constant movement and unpredictability of their operational environments make individual causal contributions difficult to distinguish. When this is combined with the fact that we delegate to systems tasks that, in human beings, would require ethical judgement and lawful behaviour, potential moral and legal responsibility gaps also arise. The more complex and autonomous the system is, the more significant the role that assurance will play in tracing and allocating responsibility, especially in contexts that are technically and organisationally complex. The research project tackles these challenges head-on. First, we clarify the fundamental concepts of responsibility, the different kinds of responsibility in play, the different agents involved, and where 'responsibility gaps' arise and how they can be addressed. Second, we build on techniques used in the technical assurance of high-risk systems to reason about responsibility in the context of uncertainty and dynamism, and therefore unpredictable socio-technical environments. Together, these strands of work provide the basis for a methodology for responsibility-by-design and responsibility-through-lifecycle that can be used in practice by a wide range of stakeholders. 
The result will be an assurance of responsibility that not only identifies which agents are responsible for which outcomes, and in what way, throughout the lifecycle, and explains how this identification is achieved, but also establishes why this tracing and allocation of responsibility is well-justified and complete.
