
NHS Lothian
12 Projects, page 1 of 3
Project 2019 - 2023: University of Edinburgh, University of London, NHS Lothian
Funder: UKRI | Project Code: EP/R005257/1 | Funder Contribution: 3,852,990 GBP

The EPSRC IRC Proteus is made up of a group of world-leading scientists, engineers and clinicians. Interdisciplinarity is at our heart: we work across traditional boundaries, linking together disciplines such as optical physics, chemical biology, biology and engineering, to name but a few. Our ambition is to translate technologies to help patients, empowering clinicians to "see disease" in front of their eyes at the bedside and helping them to make the right decisions and give the right treatments at the right time. This highly interdisciplinary collaboration, driven by clinical need and pull, has led to the design, fabrication and testing in patients of a number of world-leading bedside technology platforms. Our technology platform combines advanced fibre-optic technology (that can be readily passed into the lungs of patients) and highly sensitive detectors with highly sensitive fluorescent chemical reagents to diagnose disease. This allows clinicians to "view" inside the lung to detect bacteria or aberrant signatures of disease.

Clinical pull: Intensive care unit (ICU) patients suffer high death and disability rates and are responsible for a disproportionate financial burden on the health service (the equivalent of 1% of US GDP is spent on patients in intensive care). Potentially fatal lung complications are a common problem in ventilated ICU patients, and doctors caring for these patients face many challenges, often needing to make snap decisions without the information necessary to properly inform those decisions. The new technology platforms being developed by Proteus are helping doctors in the intensive care unit to make rapid and accurate diagnoses, allowing them to direct and inform therapy and ensure patients get the right treatment, at the right time and quickly. Although our technology platforms currently focus on the intensive care unit, they are applicable to a wide range of lung conditions and other healthcare situations, such as bowel or pancreatic cancer.

The next steps for the IRC are to take our technology into a new area in which different flavours of light can be used to diagnose disease, using the team's highly advanced light sensors (able to count a single photon). In addition, the proposal moves the IRC towards sustainability, creating a legacy from the EPSRC investment by accelerating the pathways to take new technology into patients while developing commercial opportunities. In summary, the EPSRC IRC Proteus has generated a new cohort of young interdisciplinary scientists, trained in physical and biological sciences and engineering, who have a full appreciation and practical experience of clinical translation and commercialisation pathways. They will be able to meet the challenges of converting advances in science and engineering into healthcare benefits through the development of a number of cutting-edge bedside technology platforms that will help doctors make rapid and accurate diagnoses. The team, in association with the partner universities, has also begun to make major strides towards full sustainability of the IRC, making major impacts in the areas of clinical and commercial translation, with significant academic outputs and public engagement activities.
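As a rough illustration of the photon-counting side of the platform described above, the sketch below sets a detection threshold on a single-photon detector's counts using Poisson statistics. The background rate, dwell time and false-alarm rate are invented for illustration and are not taken from the Proteus instrumentation or analysis pipeline.

    # Illustrative sketch only: a Poisson photon-counting detection rule of the
    # kind a single-photon detection pipeline might use. All rates and
    # thresholds below are made-up assumptions, not Proteus values.
    from scipy.stats import poisson

    def detection_threshold(background_rate, dwell_time, false_alarm_rate=1e-3):
        """Smallest photon count unlikely (< false_alarm_rate) to arise from
        background fluorescence alone, assuming Poisson statistics."""
        mean_background = background_rate * dwell_time
        # isf gives roughly the count above which the tail probability drops below alpha
        return int(poisson.isf(false_alarm_rate, mean_background)) + 1

    # Hypothetical numbers: 200 background counts/s, 50 ms dwell per fibre core
    threshold = detection_threshold(background_rate=200, dwell_time=0.05)
    observed_counts = 27  # photons recorded in one dwell window (made up)
    print("threshold:", threshold, "| signal detected:", observed_counts >= threshold)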
Project 2010 - 2012: University of Aberdeen, NHS Lothian, Clevercherry.com
Funder: UKRI | Project Code: EP/H042938/1 | Funder Contribution: 134,122 GBP

Parents whose baby is in a neonatal intensive care unit (NICU) are usually under a lot of stress. Much of this stress is unavoidable, but in some cases parents are under more stress than necessary because they do not understand what is happening to their baby. Although NICU medical staff of course do their best to keep parents informed, some parents may not fully understand the terminology used by doctors and nurses (and may be reluctant to admit this), and some parents may not be able to physically visit the NICU and talk to medical staff because of other commitments, such as caring for other children.

In a PhD project associated with the EPSRC-funded BabyTalk project, we have developed a software system, BT-Family, which produces summaries of a baby's status for parents. BT-Family builds on the award-winning BabyLink parent-information system used in the Edinburgh NICU, primarily by using artificial intelligence and natural language generation technology to automatically analyse and summarise the information in the baby's electronic patient record. BT-Family has been developed in consultation with parents, but it has not actually been deployed and evaluated by parents; this was not possible in the time frame of the PhD project. The goal of this project is to enhance BT-Family, deploy it "in the wild" where parents of NICU babies can use it, evaluate how useful it is, and find out how parents believe the system can be improved.

Although our focus in this project is specifically on parents of NICU babies, if this project is successful we believe that our ideas can be generalised to other situations where a parent or carer is responsible for someone in hospital. We believe that providing better information to parents and carers can reduce stress in many contexts (not just the NICU), and that this is a major opportunity to use advanced IT to enhance the quality of life of people in the unfortunate position of having a child or dependent in hospital.
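To make the natural language generation idea concrete, here is a minimal, hypothetical sketch of template-based summary generation over a few patient-record fields. The field names, thresholds and wording are invented for illustration; BT-Family's actual data model and generation pipeline are not described in this summary and are considerably richer.

    # Hypothetical sketch of template-based summarisation of a patient record
    # for parents, in the spirit of BT-Family. All fields and phrasing are
    # invented; this is not BT-Family's actual logic.
    def summarise_for_parents(record):
        sentences = []
        name = record["first_name"]
        if record["breathing_support"] == "ventilator":
            sentences.append(f"{name} is still getting help with breathing from a ventilator.")
        elif record["breathing_support"] == "cpap":
            sentences.append(f"{name} is breathing with a little support from a CPAP machine.")
        else:
            sentences.append(f"{name} is breathing without support.")
        change = record["weight_today_g"] - record["weight_yesterday_g"]
        if change >= 0:
            sentences.append(f"{name} has gained {change} grams since yesterday.")
        else:
            sentences.append(f"{name} has lost {-change} grams since yesterday, which the team is keeping an eye on.")
        return " ".join(sentences)

    example_record = {  # made-up example data
        "first_name": "Alex",
        "breathing_support": "cpap",
        "weight_today_g": 1460,
        "weight_yesterday_g": 1445,
    }
    print(summarise_for_parents(example_record))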
Project 2017 - 2021: NHS Grampian, NHS Lothian, Articulate Instruments Ltd
Funder: UKRI | Project Code: EP/P02338X/1 | Funder Contribution: 964,678 GBP

Speech Sound Disorders (SSDs) are the most common communication impairment in childhood; 16.5% of eight-year-olds have SSDs, ranging from problems with only one or two speech sounds to speech that even family members struggle to understand. SSDs can occur in isolation or as part of a disability such as Down syndrome, autism or cleft palate. In 2015, the James Lind Alliance identified improving communication skills and investigating the direction of interventions as the top two research priorities for children with disabilities. Our programme of research aims to fulfil this need by developing technology which will aid the assessment, diagnosis and treatment of SSDs.

Currently in Speech and Language Therapy, technological support is sparse. Through our previous work in the Ultrax project we showed that, by using ultrasound to image the tongue in real time, children can rapidly learn to produce speech sounds which previously seemed impossible for them. Through this project, we developed technology that enhances the ultrasound image of the tongue, making it clearer and easier to interpret. Ultrax2020 aims to take this work forward by further developing the ultrasound tongue tracker into a tool for diagnosing specific types of SSDs and by evaluating how easy it is to use ultrasound in NHS clinics. The ultimate goal of our research is that Ultrax2020 will be used by Speech and Language Therapists (SLTs) to assess and diagnose SSDs automatically, leading to quicker, more targeted intervention.

Normally, speech assessment involves listening to the child and writing down what they say. This approach can miss important subtleties in the way children speak. For example, a child may try to say "key" and it may be heard as "tea". This leads the SLT to believe the child cannot tell the difference between t and k and to select a therapy designed to tackle this. However, ultrasound allows us to view and measure the tongue, revealing that in many cases children are producing imperceptible errors. In the above example, an ultrasound scanner placed under the chin shows that the child produces both t and k simultaneously. Identification of these errors means that the SLT must choose a different therapy approach. However, ultrasound analysis is a time-consuming task which can only be carried out by a speech scientist with specialist training. A key output of Ultrax2020 is a method for analysing ultrasound automatically, creating a speech assessment tool which is both more objective and quicker to use.

Building on the work of the Ultrax project, where we developed a method of tracking ultrasound images of the tongue, Ultrax2020 aims to develop a method of classifying tongue shapes to form the basis of an automatic assessment and a way of measuring progress objectively. We are fortunate to already have a large database of ultrasound images of tongue movements from adults and primary school children, including those with speech disorders, on which to base the model of tongue shape classification and to test its performance. At the same time, we will evaluate the technology we develop as part of Ultrax2020 by partnering with NHS SLTs to collect a very large database of ultrasound from children with a wide variety of SSDs. In three different NHS clinics, SLTs will record ultrasound from over 100 children before and after ultrasound-based speech therapy. This data will be sent to a university speech scientist for analysis and feedback to clinicians, recommending intervention approaches. Towards the end of the project, we will be able to compare this gold-standard, hand-labelled analysis with the automatic classification developed during the project. At the conclusion of our research project we will have developed and validated a new ultrasound assessment and therapy tool (Ultrax2020) for Speech and Language Therapists to use in the diagnosis and treatment of SSDs.
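As a rough illustration of tongue-shape classification, the sketch below trains a standard classifier on synthetic contours that stand in for ultrasound tongue splines. The features, the two shape classes and the model choice are assumptions made for illustration; Ultrax2020's actual data, classes and classifier are not described here and may differ entirely.

    # Illustrative sketch only: classifying tongue-contour shapes.
    # Synthetic curves stand in for real ultrasound tongue splines.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 40)  # 40 points along the contour

    def synthetic_contour(peak_position):
        """A bump-shaped 'tongue contour' peaking at peak_position, with noise."""
        return np.exp(-((x - peak_position) ** 2) / 0.02) + rng.normal(0, 0.05, x.size)

    # Class 0: contours peaking towards the front; class 1: peaking towards the back
    front = np.stack([synthetic_contour(rng.uniform(0.2, 0.35)) for _ in range(100)])
    back = np.stack([synthetic_contour(rng.uniform(0.65, 0.8)) for _ in range(100)])
    X = np.vstack([front, back])
    y = np.array([0] * 100 + [1] * 100)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))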
Project 2016 - 2021: NHS Lothian, Technology to Thrive, ALDEBARAN ROBOTICS
Funder: UKRI | Project Code: EP/N034546/1 | Funder Contribution: 722,509 GBP

Autism Spectrum Disorder (ASD) affects 695,000 people in the UK, and about 547,000 of these are 18 or over (1.3% of adults of working age). The unemployment rate among adults with an ASD is higher than 85%, nearly double the unemployment rate of 48% for the wider disabled population. One reason for this is that people with an ASD struggle to interpret social signals: the expressive behavioural cues through which people manifest what they feel or think (facial expressions, vocalisations, gestures, etc.). This project will develop a Socially-Competent Robot Training Buddy that will help adults with ASD to better deal with social signals in work-related scenarios.

The project is inherently interdisciplinary and falls in the new research area of Socially Assistive Robotics, at the crossroads between robotics, psychology and social signal processing. So far, autonomous robots have largely been engineered to carry out well-defined tasks in an efficient manner. However, Socially Assistive Robots (SARs) must fit into normal human social environments and follow interaction rules that do not disrupt an office or a home or upset their human interaction partners. This project will focus on high-functioning adults with an ASD; just as physically assistive robots enable people to make movements that are difficult because of physical impairments, the SAR of this project enables people with an ASD to perform social tasks that are difficult - if not impossible - due to social cognition impairments.

The main goal is to reduce the cost of Behavioural Skills Training (BST) through the development of a Robot Training Buddy. BST is recognised as one of the most effective approaches to alleviate the effects of ASD, but it cannot be applied extensively because it is labour intensive. Using an autonomous robot would reduce the human effort and cost of BST and make it more widely available. The main technological challenge is the development of a novel affective architecture that makes a robot suitable for behaviour rehearsal, a critical stage of BST. In behaviour rehearsal, the robot must reinforce the use of appropriate social signals by its human interaction partner while inhibiting the use of inappropriate ones. The team will work with stakeholders involved with training for adults with an ASD to develop workplace-relevant scenarios in which to develop and evaluate the Training Buddy with end users. This work will develop the necessary scientific basis for the introduction of socially competent robots into human social environments, opening the way to a multitude of domestic, educational and assistive applications.
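To illustrate the reinforce/inhibit decision at the heart of behaviour rehearsal, here is a deliberately simple, hypothetical sketch: the robot labels an observed social signal as appropriate or not for the current scenario and chooses its feedback accordingly. The scenario names, signal labels and feedback wording are invented; the project's affective architecture is far richer than this.

    # Hypothetical sketch of reinforce/inhibit feedback during behaviour rehearsal.
    # Scenarios, signal labels and responses are invented for illustration.
    APPROPRIATE = {  # made-up scenario -> acceptable social signals
        "job_interview_greeting": {"eye_contact", "smile", "handshake"},
        "asking_for_help": {"eye_contact", "calm_voice"},
    }

    def rehearsal_feedback(scenario, observed_signal):
        if observed_signal in APPROPRIATE.get(scenario, set()):
            return f"Well done, that {observed_signal.replace('_', ' ')} worked well. Keep doing that."
        return (f"Let's try that again. In a {scenario.replace('_', ' ')}, "
                f"{observed_signal.replace('_', ' ')} can come across the wrong way.")

    print(rehearsal_feedback("job_interview_greeting", "eye_contact"))
    print(rehearsal_feedback("job_interview_greeting", "looking_away"))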
Project 2021 - 2023: University of Exeter, Massachusetts Institute of Technology (USA), RD&E
Funder: UKRI | Project Code: EP/V047868/1 | Funder Contribution: 202,450 GBP

Detection of bowel cancer is currently performed by visual inspection of the colonic mucosa during endoscopy, which is less reliable for small lesions that are not easily visualised. If these are not detected and removed at an early stage, there is a chance that they may become cancerous. This project seeks to develop a new mathematical tool for analysing the sensing capability of micro-robots to aid the detection of hard-to-visualise bowel lesions.

Micro-robots experiencing vibration, friction and impacts, known as non-smooth systems, exhibit a rich variety of different long-term behaviours co-existing for a given set of parameters, referred to as multi-stability or co-existing attractors. When the robot moves in the colon and encounters a lesion, a particular attractor may come to dominate its dynamics, while the other co-existing attractors fade away due to the tissue's mechanical properties associated with different stages of malignant transformation. This significant change in multi-stability can be utilised to distinguish between healthy and abnormal tissues. The applicant proposes, for the first time, to exploit the robot's multi-stability through the development of state-of-the-art numerical techniques to analyse this robot-lesion correlation, and to produce a suite of computational analysis and advanced control methods for cancer detection and staging. In the long term, this work will initiate a new modality for bowel cancer screening, delivering an efficient, minimally invasive procedure for patients. The unique research approach of this project - a joint effort of numerical and experimental studies, and collaboration with an applied mathematician and two NHS gastroenterologists - will secure a leading global position for the UK in applied non-smooth dynamics, micro-robots and early cancer diagnosis.
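As a rough illustration of co-existing attractors in a non-smooth system, the sketch below integrates a periodically driven oscillator with a one-sided (impact-like) spring from two different initial conditions under identical parameters; settling onto different steady-state amplitudes would indicate multi-stability. The model and parameter values are assumptions chosen for illustration (and may need tuning to land in a multi-stable regime); this is not the project's micro-robot model.

    # Illustrative sketch only: a driven oscillator with a one-sided spring,
    # a simple non-smooth system that can exhibit co-existing attractors.
    import numpy as np
    from scipy.integrate import solve_ivp

    zeta, gap, k_contact, amp, omega = 0.02, 1.0, 20.0, 0.5, 0.8

    def rhs(t, state):
        x, v = state
        contact = k_contact * max(x - gap, 0.0)  # one-sided spring engages past the gap
        return [v, -2 * zeta * v - x - contact + amp * np.cos(omega * t)]

    def settled_amplitude(x0, v0, n_periods=200, discard=150):
        period = 2 * np.pi / omega
        sol = solve_ivp(rhs, (0, n_periods * period), [x0, v0],
                        max_step=period / 100, rtol=1e-8)
        x_settled = sol.y[0][sol.t > discard * period]  # ignore the transient
        return x_settled.max()

    for x0, v0 in [(0.0, 0.0), (1.2, 0.0)]:
        print(f"initial condition ({x0}, {v0}) -> settled amplitude {settled_amplitude(x0, v0):.3f}")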
For further information contact us at helpdesk@openaire.eu
