1,030 Projects, page 1 of 103

  • Funder: UKRI Project Code: EP/M022234/1
    Funder Contribution: 99,750 GBP
    Partners: SEVERN TRENT WATER, NTU

    The wastewater treatment process (WWTP) plays a critical role in providing clean water. However, emerging and predominantly unregulated bioactive chemicals such as steroids and pharmaceutical drugs are being increasingly detected in surface waters that receive wastewater effluent. Although present at low concentrations, their inherent bioactive nature has been linked to abnormalities in aquatic organisms, and there are also water reuse and human health implications. As part of the urban water cycle, the WWTP is the gatekeeper to surface waters such as rivers. Pharmaceuticals enter wastewater treatment through inappropriate disposal of unused drugs to the sink/toilet or via landfill. Prescribed or illicit drug use also has the inevitable consequence of the drug being metabolised in the human body (to parent, Phase I and Phase II metabolites) and excreted in urine, which subsequently enters the WWTP. Together with naturally produced and excreted bioactive steroids, these compounds pose a challenge because wastewater treatment was never designed to remove them and does so inefficiently. Evaluating the prevalence and fate of a steroid or pharmaceutical in the WWTP is challenging because human enzymatic metabolism causes the bioactive chemical to exist in multiple forms: parent, Phase I and Phase II metabolites. Phase II metabolites predominate in urinary excretion and are the starting products entering the wastewater environment. They therefore act as the precursors to the biotransformations that take place during treatment and produce the Phase I and/or parent forms of the bioactive chemical. Before treatment technologies can be developed and evaluated for pharmaceutical and steroid removal in the WWTP, we need a better understanding of how the different bioactive chemical forms behave and how they relate to each other. This means identifying the biotransformations between metabolites and parent forms. Achieving this requires a move from targeted analysis - analysing for what we expect to see - to non-targeted methods that search for Phase II metabolites and their associated Phase I / parent forms. Drawing inspiration from metabolomics approaches used in the biosciences, the aim of this proposal is to develop a novel non-target method to identify bioactive chemical Phase II metabolites and their biotransformation products in wastewater. Knowledge of Phase II metabolite occurrence and fate in the wastewater environment is important in assessing the impact of user behaviour and of process and environmental factors on bioactive chemical parent removal. This will inform assessments of WWTP efficiency, provide data for optimising models that predict pharmaceutical and steroid concentrations, and support the evaluation of environmental risk.
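
    A minimal sketch of the kind of non-targeted screen described above (not the project's actual method): pair detected LC-MS features whose exact-mass difference matches a common Phase II conjugation, so that a putative conjugate can be linked to its parent or Phase I form. The feature masses, mass tolerance and the restriction to glucuronide and sulfate shifts are illustrative assumptions.

        # Sketch only: link LC-MS features separated by a Phase II conjugation mass shift.
        PHASE_II_SHIFTS = {
            "glucuronide": 176.03209,  # + C6H8O6
            "sulfate": 79.95682,       # + SO3
        }
        TOLERANCE = 0.005  # Da, assumed instrument mass accuracy

        def find_conjugate_pairs(feature_masses):
            """Return (conjugate_mass, parent_mass, conjugation) candidate triples."""
            pairs = []
            for conj in feature_masses:
                for parent in feature_masses:
                    for name, shift in PHASE_II_SHIFTS.items():
                        if abs((conj - parent) - shift) <= TOLERANCE:
                            pairs.append((conj, parent, name))
            return pairs

        # Hypothetical neutral masses: a steroid plus its glucuronide and sulfate.
        features = [270.1620, 446.1941, 350.1188]
        for conj, parent, name in find_conjugate_pairs(features):
            print(f"{conj:.4f} may be the {name} of {parent:.4f}")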

  • Funder: UKRI Project Code: EP/N005422/1
    Funder Contribution: 307,997 GBP
    Partners: University of Cambridge

    A primary goal of organic chemists is the construction of molecules for applications as diverse as medicines, new materials and biomolecules. The field is constantly driven by the need for new, more efficient methods, as well as ways to access molecules which may previously have been impossible to make. The most important tool at an organic chemist's disposal is undoubtedly catalysis, whereby the use of a small amount of a custom-designed catalyst can permit a reaction to occur under much milder conditions than otherwise, or opens up new chemical pathways altogether. For this reason, innovation in catalysis is central to innovation in organic chemistry. Nature's catalysis is performed by enzymes; evolution has made them phenomenally efficient. Often playing a leading role in enzymatic processes are 'hydrogen bonds', special types of electrostatic attraction which facilitate the chemical reaction between two molecules by bringing them into close proximity with one another or by stabilising the pathway leading to product formation. My research seeks to employ these same interactions, but in the context of small molecules which we can readily synthesise and handle in the lab. This approach to catalysis is still in its infancy, yet it offers exciting opportunities for both activation and control. This project will take inspiration from a distinct field within chemistry called Supramolecular Chemistry, which explores the behaviour of large molecules assembled from smaller ones using multiple weak 'temporary' interactions working in tandem. Hydrogen bonds are very important in this regard, but there are a number of other key interactions, such as ion pairs and pi-cation interactions, which have been shown to be powerful in building up molecular structures. It is our aim to apply several of these interactions in tandem to design new catalysts that will bind our reactant in a very well-defined orientation. The catalyst will also induce the substrate to react with another molecule, allowing the selective synthesis of one mirror-image form of a molecule over the other (so-called enantiomers). This is a very important pursuit in science, since the inherent 'handedness' of biological systems means that the different mirror-image forms of chiral molecules often have very different effects in the body. This is of particular importance in pharmaceutical applications.

  • Funder: UKRI Project Code: EP/N014499/1
    Funder Contribution: 2,004,300 GBP
    Partners: University of Salford, University of Edinburgh, University of Liverpool, Durham University, Dudley Group of Hospitals NHS Trust, BioQ8 Ltd, Liverpool Health Partners, North West Coast Academic Health Sci Nwk, Carl Zeiss Ltd, PHE...

    As quality of life constantly improves, the average lifespan will continue to increase. Underpinning this improvement are the UK government's substantial support for the NHS (£133.5 billion in 2011/12) and the UK pharmaceutical industry's large R&D investment (£4.9 billion in 2011/12). The expectation of quality healthcare is inevitably high among all stakeholders. Fortunately, recent advances in science and technology have enabled us to work towards personalised medicine and preventative care. This approach calls for a collective effort of researchers from a vast spectrum of specialised subjects. Advances in science and engineering are often accompanied by major developments in the mathematical sciences, as the latter underpin all other sciences. The UoL Centre will consist of a large and multidisciplinary team of applied and pure mathematicians and statisticians, together with healthcare researchers, clinicians and industrialists, collaborating with 15 HEIs and 40 NHS trusts plus other industrial partners, and including our strongest groups: MRC Centre in Drug Safety Science, Centre for Cell Imaging (CCI, for live 3D and 4D imaging), Centre for Mathematical Imaging Techniques (unique in the UK), Liverpool Biomedical EM Unit, MRC Regenerative Medicine Hub, NIHR Health Protection Research Units, and the MRC Hub for Trials Methodology Research. Several research themes are highlighted below. Firstly, an improved understanding of the interaction dynamics of cells and tissues is crucial to developing effective future cures for cancer. Much of the current work is in 2D, with restrictive assumptions and without access to real data for modelling. We shall use the unparalleled real data of cell interactions in a 3D setting generated at UoL's CCI. The real-life images obtained will have low contrast and noise, and they will be analysed and enhanced by our imaging team through the development of accurate, high-resolution imaging models. The main imaging tools needed are segmentation methods (identifying objects such as cells and tissue regions in terms of sizes, shapes and precise boundaries). We shall propose and study a class of new 3D models, using our imaging data and analysis tools, to investigate and predict the spatio-temporal dynamics. Secondly, better models of how drugs are delivered to cells in tissues will improve personalised predictions of drug toxicity. We shall combine novel imaging data of drug penetration into 3D experimental model systems with multi-scale mathematical models which scale up from the level of cells to these model systems, with the ultimate aim of making better in-vitro to in-vivo predictions. Thirdly, there exist many competing models and software packages for image processing. However, for real images that are noisy and of low contrast, few methods are robust and accurate. To improve the modelling, applied and pure mathematicians will team up to apply more sophisticated tools from hyperbolic geometry, Riemann surfaces and fractional calculus to meet the demand for accuracy, and applied mathematicians and statisticians will team up to design better data fidelity terms to model image discrepancies. Fourthly, resistance to current antibiotics means that previously treatable diseases are becoming deadly again. To understand and mitigate this, a better understanding is needed of how this resistance builds up across human interaction networks and how it depends on antibiotic prescribing practices. To understand these scenarios, the mathematics of competition in heterogeneous environments needs to be better understood. Our team links mathematical experts in analysing dynamical systems with experts in antimicrobial resistance and GPs to determine strategies that will mitigate or slow the development of antimicrobial resistance. Our research themes are aligned with, and will add value to, existing and current UoL and Research Council strategic investments, activities and future plans.
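
    The segmentation step mentioned above can be illustrated with a deliberately simple baseline (this is not the Centre's proposed 3D variational models): threshold a noisy synthetic 2D image and label the connected bright regions as candidate cells. The synthetic image and all parameter values are assumptions.

        # Baseline illustration only: Otsu threshold + connected-component labelling
        # of a noisy synthetic image containing two bright blob-like "cells".
        import numpy as np
        from skimage import filters, measure

        rng = np.random.default_rng(0)
        image = rng.normal(0.1, 0.05, (128, 128))             # noisy background
        yy, xx = np.mgrid[0:128, 0:128]
        for cy, cx in [(40, 40), (90, 70)]:                   # two synthetic cells
            image += 0.8 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 8 ** 2))

        binary = image > filters.threshold_otsu(image)        # global threshold
        labels = measure.label(binary)                        # connected components
        for region in measure.regionprops(labels):
            print(f"object {region.label}: area={region.area} px, centroid={region.centroid}")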

  • Funder: UKRI Project Code: EP/N014189/1
    Funder Contribution: 1,218,040 GBP
    Partners: University of Southampton

    The relentless growth in the amount, variety, availability, and rate of change of data has profoundly transformed essentially all aspects of human life. The Big Data revolution has created a paradox: while we create and collect more data than ever before, it is not always easy to unlock the information it contains. To turn the easy availability of data into a major scientific and economic advantage, it is imperative that we create analytic tools equal to the challenge presented by the complexity of modern data. In recent years, breakthroughs in topological data analysis and machine learning have paved the way for significant progress towards creating efficient and reliable tools to extract information from data. Our proposal has been designed to address the scope of the call as follows. To 'convert the vast amounts of data produced into understandable, actionable information' we will create a powerful fusion of machine learning, statistics, and topological data analysis. This combination of statistical insight with the computational power of machine learning and the flexibility, scalability, and visualisation tools of topology will allow a significant reduction in the complexity of the data under study. The results will be output in a form that is best suited to the intended application or the scientific problem at hand. This way, we will create a seamless pathway from data analysis to implementation, which will allow us to control every step of this process. In particular, the intended end user will be able to query the results of the analysis to extract the information relevant to them. In summary, our work will provide tools to extract information from complex data sets to support user investigations or decisions. It is now well established that a main challenge of Big Data is how 'to efficiently and intelligently extract knowledge from heterogeneous, distributed data while retaining the context necessary for its interpretation'. This will be addressed first of all by developing techniques for dealing with heterogeneous data. A main strength of topology is its ability to identify simple components in complex systems. It can also provide guiding principles on how to combine elements to create a model of a complex system, and it provides numerical techniques to control the overall shape of the resulting model to ensure that it fits the original constraints. We will use the particular strengths of machine learning, statistics and topology to identify the main properties of data, which will then be combined to provide an overall analysis of the data. For example, a collection of text documents can be analysed using machine learning techniques to create a graph which captures similarities between documents in a topological way; this is an efficient way to classify a corpus of documents according to a desired set of keywords. An important part of our investigation will be to develop robust techniques of data fusion, which is important in many applications. One of our main applications will address the problem of creating a set of descriptors to diagnose and treat asthma. There are five main pathways for clinical diagnosis of asthma, each supported by data. To create a coherent picture of the disease we need to understand how to combine the information contained in these separate data sets to create the so-called 'asthma handprint', which is a major challenge in this part of medicine. Every novel methodology of data analysis has to prove that its 'techniques are realistic, compatible and scalable with real-world services and hardware systems'. The best way to do that is to engage from the outset with challenging applications, and to ensure that theoretical and modelling solutions fit the intended applications well. We offer a unique synergy between theory and modelling as well as world-class facilities in medicine and chemistry which will provide a strict test for our ideas and results.
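
    As a minimal illustration of the document-similarity graph mentioned above (not the project's actual pipeline), one can connect documents whose TF-IDF cosine similarity exceeds a threshold; the toy corpus and the cut-off value below are assumptions.

        # Toy sketch: nodes are documents, edges join pairs whose cosine similarity
        # (on TF-IDF vectors) exceeds an assumed threshold.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        docs = [
            "asthma diagnosis from airway inflammation data",
            "inflammation biomarkers in asthma patients",
            "topological data analysis of point clouds",
        ]
        tfidf = TfidfVectorizer().fit_transform(docs)
        sim = cosine_similarity(tfidf)

        threshold = 0.2  # assumed cut-off
        edges = [(i, j, round(float(sim[i, j]), 3))
                 for i in range(len(docs)) for j in range(i + 1, len(docs))
                 if sim[i, j] >= threshold]
        print(edges)  # documents 0 and 1 share terms, so an edge between them is expected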

  • Funder: UKRI Project Code: EP/M02976X/1
    Funder Contribution: 335,616 GBP
    Partners: University of Surrey

    Many industrial processing operations depend on feed materials that are fine powders with poor handling characteristics, which have to be rectified by granulation to form coarser granules. Generally wet granulation is employed, in which a binder is added to the powder in a mixer, usually in batch processes. Continuous Twin Screw Granulation (TSG) has considerable potential, e.g. in the pharmaceutical sector, because of its flexibility in throughput and equipment design, reproducibility, short residence times, smaller liquid/solid ratios and the ability to granulate difficult-to-process formulations. However, there remain significant technical issues that limit its widespread use, and a greater understanding of the process is required to meet regulatory requirements. Moreover, encapsulated APIs (Active Pharmaceutical Ingredients) are of increasing interest, and the development of a TSG process that did not damage such encapsulates would significantly extend applications. Experimental optimisation of TSG is expensive and often sub-optimal because of the high costs of APIs, and it does not lead to a more generic understanding of the process. Computational modelling of the behaviour of individual feed particles during the process will overcome these limitations. The Distinct Element Method (DEM) is the most widely used method, but it has rarely been applied to the number of particles in a TSG extruder (~55 million), and such examples involve simplified interparticle interactions, e.g. by assuming that the particles are smooth and spherical and that any liquid is present as discrete bridges rather than the greater saturation states associated with granulation. The project will be based on a multiscale strategy to develop advanced interaction laws that are more representative of real systems. The bulk and interfacial properties of a swelling particulate binder such as microcrystalline cellulose will be modelled using coarse-grained Molecular Dynamics to derive inputs into a meso-scale Finite Discrete Element Method model of formulations that include hard particles and a viscous polymeric binder (hydroxypropylcellulose). Elastic particles (e.g. lactose and encapsulates) with viscous binder formulations will be modelled using the Fast Multipole Boundary Element Method. These micro- and meso-scale models will be used to provide closure for a DEM model of TSG. The project will involve collaboration with the Chinese Academy of Sciences, which has pioneered the application of massively parallel high-performance computing with GPU clusters to discrete modelling such as DEM, albeit with existing, simpler interaction laws. An extensive experimental programme will be deployed to measure physical inputs and validate the models. The screw design and operating conditions of TSG for the formulations considered will be optimised using DEM and the results validated empirically. Optimisation criteria will include the granule size distribution, the quality of tablets for granules produced from the lactose formulation and the minimisation of damage to encapsulates. The primary benefit will be to provide a modelling toolbox for TSG, enabling more rapid and cost-effective optimisation and allowing encapsulated APIs to be processed. Detailed data post-processing will elucidate mechanistic information that will be used to develop regime performance maps.
The multiscale modelling will have applications to a wide range of multiphase systems, as exemplified by a large fraction of consumer products, catalyst pastes for extrusion processes, and agricultural products such as pesticides. The micro- and mesoscopic methods have generic applications for studying the bulk and interfacial behaviour of hard and soft particles and also droplets in emulsions. The combination of advanced modelling and implementation on massively parallel high-performance GPU clusters will allow unprecedented applications to multiphase systems of enormous complexity.
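
    For readers unfamiliar with DEM, the sketch below shows the standard linear spring-dashpot normal-contact force that many DEM codes start from; the project's wet, multiscale interaction laws are far richer, and all parameter values here are hypothetical.

        # Linear spring-dashpot normal contact between two spheres (standard DEM
        # baseline, not the project's advanced interaction laws).
        import numpy as np

        def normal_contact_force(x_i, x_j, r_i, r_j, v_i, v_j, k_n=1.0e4, c_n=5.0):
            """Force on particle i from contact with particle j (zero if no overlap)."""
            d = x_j - x_i
            dist = np.linalg.norm(d)
            overlap = (r_i + r_j) - dist
            if overlap <= 0.0:
                return np.zeros(3)
            n = d / dist                              # unit normal from i towards j
            v_rel_n = np.dot(v_i - v_j, n)            # normal approach speed
            f_mag = k_n * overlap + c_n * v_rel_n     # elastic + viscous damping
            return -f_mag * n                         # repulsion pushes i away from j

        # Two slightly overlapping 1 mm particles, particle i moving towards j.
        f = normal_contact_force(np.zeros(3), np.array([1.9e-3, 0.0, 0.0]),
                                 1e-3, 1e-3, np.array([0.01, 0.0, 0.0]), np.zeros(3))
        print(f)  # points in -x: particle i is pushed away from j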

  • Funder: UKRI Project Code: EP/M02525X/1
    Funder Contribution: 341,698 GBP
    Partners: City, University of London

    The product rule of the familiar operation of taking derivatives of real-valued functions has a plethora of generalisations and applications in algebra. It leads to the notion of derivations of algebras - these are linear endomorphisms of an algebra satisfying the product rule. They represent the classes of the first Hochschild cohomology of an algebra. The first Hochschild cohomology of an algebra turns out to be a Lie algebra and, more precisely, a restricted Lie algebra if the underlying base ring is a field of positive characteristic. The (restricted) Lie algebra structure extends to all positive degrees in Hochschild cohomology - this goes back to pioneering work of Gerstenhaber on deformations of algebras. Modular representation theory of finite groups seeks to understand the connections between the structure of finite groups and the associated group algebras. Many of the conjectures that drive this area are - to date mysterious - numerical coincidences relating invariants of finite group algebras to invariants of the underlying groups. The sophisticated cohomological technology hinted at above is expected to yield some insight regarding these coincidences, and the present proposal puts the focus on some precise and unexplored invariance properties of certain groups of integrable derivations under Morita, derived, or stable equivalences between indecomposable algebra factors of finite group algebras, their character theory, their automorphism groups, and the local structure of finite groups.
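
    For concreteness, the standard definitions behind the opening sentences are as follows (textbook facts, not specific to this proposal): a derivation D of a k-algebra A satisfies the product rule, the commutator of two derivations is again a derivation, and in characteristic p the p-fold composite of a derivation is again a derivation, which is the source of the restricted Lie algebra structure:

        \[ D(ab) = D(a)\,b + a\,D(b) \quad (a, b \in A), \qquad
           [D_1, D_2] = D_1 \circ D_2 - D_2 \circ D_1, \qquad
           D^{[p]} = \underbrace{D \circ \cdots \circ D}_{p} \ \ (\operatorname{char} k = p). \]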

  • Funder: UKRI Project Code: EP/M009718/1
    Funder Contribution: 100,454 GBP
    Partners: QMUL

    The theory of operator algebras goes back to Murray, von Neumann, Gelfand and Naimark. The original motivation was to provide a mathematical foundation for quantum mechanics. At the same time, from the very beginning of the subject, it was anticipated that operator algebras form very interesting structures in their own right and would have applications to unitary representations of groups and to operator theory in Hilbert space. Actually, much more turned out to be true. After some dramatic and unexpected developments, the theory of operator algebras has established itself as a very active and highly interdisciplinary research area. Not only do there exist - as initially envisioned - strong connections to quantum physics as well as to representation theory and operator theory, but operator algebras nowadays have far-reaching applications in various mathematical disciplines such as functional analysis, algebra, geometric group theory, geometry, topology and dynamical systems. One of the most important classes of operator algebras is given by C*-algebras, which are defined as norm-closed, self-adjoint algebras of bounded linear operators on a Hilbert space. As in many areas of mathematics, advances in the theory of C*-algebras went hand in hand with the discovery of interesting and illuminating examples, the most prominent ones being group C*-algebras and C*-algebras attached to dynamical systems, so-called crossed products. The main objects of study in this research project are semigroup C*-algebras, which are natural generalisations of group C*-algebras. Our goal is to analyse the structure of semigroup C*-algebras and to use this construction as a tool to study groups and their subsemigroups from the point of view of geometric group theory. Closely related to this, the project also aims at a better understanding of the interplay between C*-algebras and dynamical systems. Our project lies at the frontier of current research. We take up recent advances in semigroup C*-algebras, the classification of C*-algebras, the interplay between C*-algebras and symbolic dynamics, as well as the discovery of rigidity phenomena in operator algebras and dynamical systems. One of the key characteristics of our research project is its highly interdisciplinary character. It lies at the interface of several research areas in mathematics and brings together expertise from different fields. This takes up the trend in mathematics that interactions between different branches are becoming more and more important. Therefore, the mathematical community as a whole will benefit through an active and inspiring exchange of ideas.
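
    As a reminder of the standard definitions referred to above (not specific to this project): every C*-algebra, being a norm-closed, self-adjoint algebra of bounded operators, satisfies the C*-identity, and the reduced semigroup C*-algebra of a left-cancellative semigroup P is generated by the isometries of its left regular representation on l^2(P):

        \[ \|a^*a\| = \|a\|^2, \qquad
           V_p\,\delta_q = \delta_{pq} \ \ (p, q \in P), \qquad
           C^*_\lambda(P) = C^*\big(\{\,V_p : p \in P\,\}\big) \subseteq B\big(\ell^2(P)\big). \]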

  • Funder: UKRI Project Code: EP/M016005/1
    Funder Contribution: 302,791 GBP
    Partners: Three, TRTUK, University of London, VCE Mobile & Personal Comm Ltd

    Spectrum is a precious but scarce natural resource. In the UK, Ofcom will free up the analogue TV spectrum at 800 MHz (together with the available 2.6 GHz band) for 4G, which has already raised £2.34 billion for the national purse. According to Ofcom, the amount of data Britons consume on the move each month has already hit 20 million gigabytes, mainly due to users' engagement with video, TV and films while on the move. It is also commonly understood by mobile operators that a 1000-fold increase in system capacity will be needed by 2020 to avoid mobile networks grinding to a halt. Maximising spectral efficiency, which is limited by interference and fading in wireless networks including 4G, is therefore a major issue. An emerging idea, championed by Alcatel-Lucent and already receiving serious consideration from vendors and operators, is that of a massive MIMO antenna system. This technology has the potential to unlock the issue of spectrum scarcity and to enhance spectrum usage tremendously by enabling simultaneous access by tens or hundreds of terminals in the same time-frequency resource. For massive MIMO technology to attain its utmost potential, various challenges need to be addressed: channel estimation and acquisition in the face of pilot contamination, fast spatial-temporal variations in signal power, and autonomous resource allocation, in particular when a large number of users access the system simultaneously. The focus of this project is on tackling these fundamental challenges by advancing aspects of information theory, estimation theory and network optimisation. In particular, we will contribute by modelling massive MIMO channels underpinned by heterogeneous correlation structures; performing information-theoretic analysis in terms of random matrix theory through shrinkage estimators; designing robust precoders for massive MIMO in the presence of channel estimation errors; developing novel channel estimation techniques in the presence of severe pilot contamination; and proposing and analysing game-theoretic algorithms for autonomous resource allocation and pilot assignment. All the concepts and algorithms developed will be integrated, and the radio link layer performance will be assessed using a simulation reference system based on LTE-Advanced standards and their evolution towards 5G. Industrial partners will be engaged throughout the project to ensure the industrial relevance of our work.
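
    A toy numerical illustration of the pilot contamination problem mentioned above (not the project's estimators): when two cells reuse the same pilot sequence, the least-squares channel estimate at the base station captures the sum of the desired and interfering channels rather than the desired channel alone. The antenna count, pilot length and noise level are assumptions.

        # Toy pilot-contamination demo with a shared pilot sequence.
        import numpy as np

        rng = np.random.default_rng(1)
        M, tau, noise_std = 64, 8, 0.1                        # antennas, pilot length, noise
        pilot = np.exp(2j * np.pi * rng.random(tau))          # shared unit-modulus pilot
        h_des = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
        h_int = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)

        noise = noise_std * (rng.standard_normal((M, tau)) + 1j * rng.standard_normal((M, tau)))
        Y = np.outer(h_des, pilot) + np.outer(h_int, pilot) + noise   # received pilot block

        h_ls = Y @ pilot.conj() / np.linalg.norm(pilot) ** 2          # least-squares estimate
        print("error vs desired channel  :", np.linalg.norm(h_ls - h_des))
        print("error vs contaminated sum :", np.linalg.norm(h_ls - (h_des + h_int)))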

  • Funder: UKRI Project Code: EP/M024512/1
    Funder Contribution: 244,299 GBP
    Partners: KCL

    While the theory of minimal and constant mean curvature (CMC) surfaces is a purely mathematical one, such surfaces present themselves overtly in nature and are studied in many material sciences. This makes the theory all the more exciting. If we take a closed wire and dip it in and out of soapy water, the soap film that forms across the loop is in fact a minimal surface; the physical properties of soap films were already studied by Plateau in the 1850s. The air pressure on the two sides of a soap film is equal and constant. However, if we change the pressure on one side, for instance by blowing air on it, the new surface that we obtain is what we call a soap bubble. A soap bubble is a CMC surface. More precisely, minimal and CMC surfaces are, respectively, mathematical idealisations of soap films and soap bubbles. The mean curvature of a soap film or bubble is a quantity proportional to the pressure difference across the film. The value of the pressure difference, and therefore of the mean curvature, is zero for a soap film/minimal surface and a non-zero constant for a soap bubble/CMC surface. Since the pressure inside a small bubble is greater than the pressure inside a big one, the constant mean curvature of a small bubble is greater than that of a big one. Minimal and CMC surfaces also enjoy crucial minimising properties relative to area. Among all surfaces spanning a given boundary, a soap film/minimal surface is one with locally least area; soap bubbles/CMC surfaces locally minimise area under a volume constraint. This project aims to investigate several key geometric properties of minimal and CMC surfaces. Roughly speaking, I intend to prove several results about CMC surfaces embedded in a flat three-dimensional manifold, including area estimates when the surfaces are compact with bounded genus and the ambient manifold is compact. I also plan to study the limits of sequences of minimal or CMC surfaces embedded in a general three-dimensional manifold.
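
    The pressure-curvature relationship described above is the classical Young-Laplace law. For a single spherical interface of radius R with surface tension sigma, and with the convention that the mean curvature of such a sphere is H = 1/R, it reads

        \[ \Delta p = 2\sigma H = \frac{2\sigma}{R}, \]

    so a flat soap film has Delta p = 0 and H = 0, while a smaller bubble has larger mean curvature and hence a larger pressure difference. (For a physical soap bubble with two film faces the constant doubles, but the proportionality is the same.)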

  • Funder: UKRI Project Code: EP/M020207/1
    Funder Contribution: 977,312 GBP
    Partners: BAM, MTC, RCNDE, BP British Petroleum, ROLLS-ROYCE PLC, Tenaris, University of Salford, Imperial College London

    If imaging required less data, it would enable faster throughput, improved performance in restricted-access situations and simpler, cheaper hardware. The information from images enables damage within engineering components to be accurately quantified, avoiding the need to choose between excessive conservatism and unpredicted failures. To enable improved reconstructions from limited data sets, a diverse set of approaches has been identified, incorporating knowledge of the physical interaction of waves with objects, the use of external information, image processing and other techniques. The fellowship will address the broad problem by applying these approaches to several example applications of great interest to industry, and will ultimately enable the development of the field of limited-data imaging. While primarily focused on NDE (non-destructive evaluation), the applications of this work spread to areas including medicine, geophysics and security.
