- Project . 2015 - 2018 . Funder: UKRI . Project Code: EP/M022234/1 . Funder Contribution: 99,750 GBP . Partners: SEVERN TRENT WATER, NTU
The wastewater treatment process (WWTP) plays a critical role in providing clean water. However, emerging and predominantly unregulated bioactive chemicals such as steroids and pharmaceutical drugs are increasingly being detected in surface waters that receive wastewater effluent. Although present at low concentrations, their inherent bioactivity has been linked to abnormalities in aquatic organisms, and there are also water reuse and human health implications. As part of the urban water cycle, the WWTP is the gatekeeper to surface waters such as rivers. Pharmaceuticals enter wastewater treatment through inappropriate disposal of unused drugs to the sink/toilet or via landfill. Prescribed or illicit drugs are also inevitably metabolised in the human body (to parent compounds and Phase I/II metabolites) and excreted in urine, which subsequently enters the WWTP. Together with naturally produced and excreted bioactive steroids, these compounds pose a fundamental challenge: wastewater treatment was never designed to remove them, and does so inefficiently. Evaluating the prevalence and fate of a steroid or pharmaceutical in the WWTP is challenging because human enzymatic metabolism causes the bioactive chemical to exist in multiple forms: parent, Phase I and Phase II metabolites. Phase II metabolites predominate in urinary excretion and are the starting products entering the wastewater environment. They therefore act as the precursors to the biotransformations that take place during treatment and produce the Phase I and/or parent forms of the bioactive chemical. Before treatment technologies can be developed and evaluated for pharmaceutical and steroid removal in the WWTP, we need a better understanding of how the different forms of a bioactive chemical behave and how they relate to one another. This means identifying the biotransformations between metabolites and parent forms. Achieving this requires a move from targeted analysis, in which we analyse for what we expect to see, to non-targeted methods that search for Phase II metabolites and their associated Phase I/parent forms. Drawing inspiration from metabolomics approaches used in the biosciences, the aim of this proposal is to develop a novel non-target method to identify bioactive chemical Phase II metabolites and their biotransformation products in wastewater. Knowledge of Phase II metabolite occurrence and fate in the wastewater environment is important in assessing the impact of user behaviour and of process and environmental factors on removal of the parent bioactive chemical. This will inform assessments of WWTP efficiency, provide data for optimising models that predict pharmaceutical and steroid fate, and support evaluation of environmental risk.
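As a rough illustration of the kind of non-target screening described here (a sketch under assumed values, not the project's actual method), candidate Phase II conjugates can be linked to their Phase I/parent forms by scanning a peak list for characteristic neutral-mass shifts such as glucuronidation (+176.03209 Da) or sulfation (+79.95682 Da):

```python
# Minimal sketch (not the project's method): linking candidate Phase II
# conjugates to their deconjugated (Phase I / parent) forms in a
# non-targeted peak list via characteristic neutral-mass shifts.
# The feature masses below are hypothetical illustration values.

# Exact monoisotopic mass shifts for common Phase II conjugations (Da)
CONJUGATE_SHIFTS = {
    "glucuronide": 176.03209,  # + C6H8O6
    "sulfate": 79.95682,       # + SO3
}

def match_conjugate_pairs(masses, tol_ppm=5.0):
    """Return (parent_mass, conjugate_mass, conjugation) triples whose
    mass difference matches a Phase II shift within tol_ppm."""
    pairs = []
    for m_parent in masses:
        for m_conj in masses:
            for name, shift in CONJUGATE_SHIFTS.items():
                expected = m_parent + shift
                if abs(m_conj - expected) / expected * 1e6 <= tol_ppm:
                    pairs.append((m_parent, m_conj, name))
    return pairs

# Hypothetical features: estradiol (272.17763 Da) and its glucuronide
masses = [272.17763, 448.20972, 310.04500]
print(match_conjugate_pairs(masses))
```

In practice such mass-shift matching would be only one line of evidence, combined with retention-time and fragmentation data.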
- Project . 2015 - 2018 . Funder: UKRI . Project Code: EP/N005422/1 . Funder Contribution: 307,997 GBP . Partners: University of Cambridge
A primary goal of organic chemists is the construction of molecules for applications as diverse as medicines, new materials and biomolecules. The field is constantly driven by the need for new, more efficient methods, as well as ways to access molecules which may previously have been impossible to make. The most important tool at an organic chemist's disposal is undoubtedly catalysis, whereby a small amount of a custom-designed catalyst can permit a reaction to occur under much milder conditions than otherwise, or can open up new chemical pathways altogether. For this reason, innovation in catalysis is central to innovation in organic chemistry. Nature's catalysis is performed by enzymes; evolution has made them phenomenally efficient. Often playing a leading role in enzymatic processes are 'hydrogen bonds', special types of electrostatic attraction which facilitate the chemical reaction between two molecules by bringing them into close proximity with one another or by stabilising the pathway leading to product formation. My research seeks to employ these same interactions, but in the context of small molecules which we can readily synthesise and handle in the lab. This approach to catalysis is still in its infancy, yet it offers exciting opportunities for both activation and control. This project will take inspiration from a distinct field within chemistry called Supramolecular Chemistry, which explores the behaviour of large molecules assembled from smaller ones using multiple weak 'temporary' interactions working in tandem. Hydrogen bonds are very important in this regard, but there are a number of other key interactions, such as ion pairing and pi-cation interactions, which have been shown to be powerful in building up molecular structures. It is our aim to apply several of these interactions in tandem to design new catalysts that will bind our reactant in a well-defined orientation. The catalyst will also induce the substrate to react with another molecule, allowing the selective synthesis of one mirror-image form of a molecule over the other (the so-called enantiomers). This is a very important pursuit in science, since the inherent 'handedness' of biological systems means that the different mirror-image forms of chiral molecules often have very different effects in the body. This is of particular importance in pharmaceutical applications.
- Project . 2015 - 2016 . Funder: UKRI . Project Code: EP/M014134/1 . Funder Contribution: 97,162 GBP . Partners: NTU
Asphalt pavements are the most common type of road pavement in the UK. Preserving them in a proper state fundamentally affects the economy and quality of life. However, their surveillance and maintenance are cost- and time-intensive, and asphalt concrete still has to be replaced after 15 years of use. Applying induction heating to the road could make pavements last much longer by stimulating asphalt's capacity for self-healing. Experimental results have shown that a crack can be fully induction-healed at least five times. The efficiency of self-healing, however, depends on the temperature of the material, and the heating should be concentrated in the cracks alone. Thus, the challenge of this research is to discover how to apply energy only locally into the cracks without dispersing energy into undesired spaces. To this end, experimental and mathematical models of asphalt concrete self-healing under induction heating will be developed. This research will serve to understand the relationships between induction heating, the particles used to heat the mixture, the heat flow through asphalt concrete and its effect on asphalt self-healing. We will discover which types of particles and which intensities and frequencies of induction heating are most appropriate for healing, how to concentrate the heat in the damaged areas, and the relationship between the amount of energy induced and the healing of asphalt concrete.
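As a minimal sketch of the kind of mathematical model envisaged (illustrative only; the material parameters and source term below are assumed, not measured), consider 1D heat conduction through an asphalt layer with an induction source concentrated at a crack:

```python
import numpy as np

# Minimal sketch (not the project's model): explicit finite-difference
# solution of 1D heat conduction in an asphalt layer, with an
# induction-heating source concentrated at a crack at mid-domain.
# All material parameters are rough assumed values.

L, nx = 0.10, 101            # domain length (m), grid points
dx = L / (nx - 1)
alpha = 1.0e-6               # thermal diffusivity (m^2/s), assumed
dt = 0.4 * dx**2 / alpha     # stable step for the explicit scheme
rho_c = 2.0e6                # volumetric heat capacity (J/m^3/K), assumed

x = np.linspace(0.0, L, nx)
T = np.full(nx, 20.0)        # initial temperature (deg C); ends held at 20

# Induction source: Gaussian power density centred on a crack at x = L/2
q = 5.0e5 * np.exp(-((x - L / 2) / 0.005) ** 2)   # W/m^3, assumed

for _ in range(2000):        # ~800 s of heating
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * (alpha * lap + q[1:-1] / rho_c)

print(f"peak temperature at crack: {T.max():.1f} C")
```

The research question in the abstract then corresponds to shaping the source term q(x) (particle type, field intensity and frequency) so the temperature rise stays localised at the crack.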
- Project . 2015 - 2021 . Funder: UKRI . Project Code: EP/N014499/1 . Funder Contribution: 2,004,300 GBP . Partners: University of Salford, University of Edinburgh, University of Liverpool, Durham University, Dudley Group of Hospitals NHS Trust, BioQ8 Ltd, Liverpool Health Partners, North West Coast Academic Health Sci Nwk, Carl Zeiss Ltd, PHE...
As quality of life constantly improves, the average lifespan will continue to increase. Underpinning this improvement are the UK government's substantial support for the NHS (£133.5 billion in 2011/12) and the UK pharmaceutical industry's large R&D investment (£4.9 billion in 2011/12). The expectation of quality healthcare is inevitably high among all stakeholders. Fortunately, recent advances in science and technology have enabled us to work towards personalised medicine and preventative care. This approach calls for a collective effort from researchers across a vast spectrum of specialised subjects. Advances in science and engineering are often accompanied by major developments in the mathematical sciences, as the latter underpin all other sciences. The UoL Centre will consist of a large and multidisciplinary team of applied and pure mathematicians and statisticians, together with healthcare researchers, clinicians and industrialists, collaborating with 15 HEIs and 40 NHS trusts plus other industrial partners, and including our strongest groups: MRC Centre in Drug Safety Science, Centre for Cell Imaging (CCI, for live 3D and 4D imaging), Centre for Mathematical Imaging Techniques (unique in the UK), Liverpool Biomedical EM Unit, MRC Regenerative Medicine Hub, NIHR Health Protection Research Units, and MRC Hub for Trials Methodology Research. Several research themes are highlighted below.
Firstly, an improved understanding of the interaction dynamics of cells and tissues is crucial to developing effective future cures for cancer. Much of the current work is in 2D, with restrictive assumptions and without access to real data for modelling. We shall use the unparalleled real data on cell interactions in a 3D setting generated at UoL's CCI. The real-life images obtained will have low contrast and noise; they will be analysed and enhanced by our imaging team through the development of accurate, high-resolution imaging models. The main imaging tools needed are segmentation methods (identifying objects such as cells and tissue regions in terms of sizes, shapes and precise boundaries). We shall propose and study a class of new 3D models, using our imaging data and analysis tools, to investigate and predict the spatio-temporal dynamics.
Secondly, better models of how drugs are delivered to cells in tissues will improve personalised predictions of drug toxicity. We shall combine novel imaging data of drug penetration into 3D experimental model systems with multi-scale mathematical models which scale up from the level of cells to these model systems, with the ultimate aim of making better in-vitro to in-vivo predictions.
Thirdly, there exist many competing models and software packages for image processing. However, for real images that are noisy and of low contrast, few methods are robust and accurate. To improve the modelling, applied and pure mathematicians will team up to apply more sophisticated tools from hyperbolic geometry, Riemann surfaces and fractional calculus to meet the demand for accuracy, and applied mathematicians and statisticians will team up to design better data fidelity terms to model image discrepancies.
Fourthly, resistance to current antibiotics means that previously treatable diseases are becoming deadly again. To understand and mitigate this, a better understanding is needed of how this resistance builds up across human interaction networks and how it depends on antibiotic prescribing practices. To understand these scenarios, the mathematics of competition in heterogeneous environments needs to be better understood; a minimal illustrative model of this kind is sketched below. Our team links mathematical experts in analysing dynamical systems with experts in antimicrobial resistance and GPs to determine strategies that will mitigate or slow the development of antimicrobial resistance. Our research themes are aligned with, and will add value to, existing and current UoL and Research Council strategic investments, activities and future plans.
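A minimal sketch of such a competition model (purely illustrative; the Centre's models will be far richer, and all parameters here are assumed) is a Lotka-Volterra-type system for antibiotic-sensitive and resistant populations:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal sketch (not the Centre's model): competition between
# antibiotic-sensitive (S) and resistant (R) bacterial populations
# sharing a carrying capacity, with a constant antibiotic kill rate
# applied to S only. All parameters are assumed values.

r_s, r_r = 1.0, 0.8      # growth rates; resistance carries a fitness cost
K = 1.0                  # shared carrying capacity
kill = 0.5               # extra death rate of sensitive cells under antibiotic

def rhs(t, y):
    S, R = y
    total = S + R
    dS = r_s * S * (1 - total / K) - kill * S
    dR = r_r * R * (1 - total / K)
    return [dS, dR]

# start from a mostly-sensitive population with a small resistant fraction
sol = solve_ivp(rhs, (0.0, 50.0), [0.99, 0.01])
S_end, R_end = sol.y[:, -1]
print(f"final populations: sensitive={S_end:.3f}, resistant={R_end:.3f}")
```

Under sustained antibiotic pressure the resistant strain takes over despite its fitness cost; varying the kill rate across a network of such patches is one way to pose the prescribing-practice question mathematically.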
- Project . 2015 - 2019 . Funder: UKRI . Project Code: EP/N014189/1 . Funder Contribution: 1,218,040 GBP . Partners: University of Southampton
The relentless growth in the amount, variety, availability and rate of change of data has profoundly transformed essentially all aspects of human life. The Big Data revolution has created a paradox: while we create and collect more data than ever before, it is not always easy to unlock the information it contains. To turn the easy availability of data into a major scientific and economic advantage, it is imperative that we create analytic tools equal to the challenge presented by the complexity of modern data. In recent years, breakthroughs in topological data analysis and machine learning have paved the way for significant progress towards creating efficient and reliable tools to extract information from data. Our proposal has been designed to address the scope of the call as follows. To 'convert the vast amounts of data produced into understandable, actionable information' we will create a powerful fusion of machine learning, statistics, and topological data analysis. This combination of statistical insight, the computational power of machine learning, and the flexibility, scalability and visualisation tools of topology will allow a significant reduction in the complexity of the data under study. The results will be output in a form that is best suited to the intended application or the scientific problem at hand. This way, we will create a seamless pathway from data analysis to implementation, which will allow us to control every step of this process. In particular, the intended end user will be able to query the results of the analysis to extract the information relevant to them. In summary, our work will provide tools to extract information from complex data sets to support user investigations or decisions. It is now well established that a main challenge of Big Data is how 'to efficiently and intelligently extract knowledge from heterogeneous, distributed data while retaining the context necessary for its interpretation'. This will be addressed first of all by developing techniques for dealing with heterogeneous data. A main strength of topology is its ability to identify simple components in complex systems. It can also provide guiding principles on how to combine elements to create a model of a complex system, and it provides numerical techniques to control the overall shape of the resulting model to ensure that it fits the original constraints. We will use the particular strengths of machine learning, statistics and topology to identify the main properties of data, which will then be combined to provide an overall analysis of the data. For example, a collection of text documents can be analysed using machine learning techniques to create a graph which captures similarities between documents in a topological way. This is an efficient way to classify a corpus of documents according to a desired set of keywords. An important part of our investigation will be to develop robust techniques of data fusion, which is important in many applications. One of our main applications will address the problem of creating a set of descriptors to diagnose and treat asthma. There are five main pathways for clinical diagnosis of asthma, each supported by data. To create a coherent picture of the disease we need to understand how to combine the information contained in these separate data sets to create the so-called 'asthma handprint', which is a major challenge in this part of medicine.
Every novel methodology of data analysis has to prove that its 'techniques are realistic, compatible and scalable with real-world services and hardware systems'. The best way to do that is to engage from the outset with challenging applications, and to ensure that theoretical and modelling solutions fit the intended applications well. We offer a unique synergy between theory and modelling, as well as world-class facilities in medicine and chemistry, which will provide a strict test for our ideas and results.
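As a small illustration of the topological ingredient (a sketch, not the project's toolkit), 0-dimensional persistent homology of a point cloud can be computed with a union-find structure over edges sorted by length; long-lived bars reveal the kind of cluster structure that could feed into the machine-learning and statistical layers:

```python
import numpy as np
from itertools import combinations

# Minimal sketch (not the project's software): 0-dimensional persistent
# homology of a point cloud, i.e. the scales at which connected
# components of the Vietoris-Rips filtration merge, via union-find.

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path compression
        i = parent[i]
    return i

def h0_persistence(points):
    n = len(points)
    edges = sorted(
        (float(np.linalg.norm(points[i] - points[j])), i, j)
        for i, j in combinations(range(n), 2)
    )
    parent = list(range(n))
    deaths = []                          # scales at which components merge
    for d, i, j in edges:
        ri, rj = find(parent, i), find(parent, j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)
    # every point is born at scale 0; one component never dies
    return [(0.0, d) for d in deaths] + [(0.0, float("inf"))]

# Two well-separated clusters: one long-lived finite bar reveals them
rng = np.random.default_rng(0)
pts = np.vstack([rng.standard_normal((20, 2)),
                 rng.standard_normal((20, 2)) + 8.0])
bars = sorted(h0_persistence(pts), key=lambda b: -min(b[1], 1e18))
print(bars[:3])   # the infinite bar, then the inter-cluster merge scale
```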
- Project . 2015 - 2019 . Funder: UKRI . Project Code: EP/M02976X/1 . Funder Contribution: 335,616 GBP . Partners: University of Surrey
Many industrial processing operations depend on feed materials that are fine powders with poor handling characteristics, which have to be rectified by granulation to form coarser granules. Generally wet granulation is employed, in which a binder is added to the powder in a mixer, usually in batch processes. Continuous Twin Screw Granulation (TSG) has considerable potential, e.g. in the pharmaceutical sector, because of the flexibility in throughput and equipment design, reproducibility, short residence times, smaller liquid/solid ratios, and the ability to granulate difficult-to-process formulations. However, there remain significant technical issues that limit its widespread use, and a greater understanding of the process is required to meet regulatory requirements. Moreover, encapsulated APIs (Active Pharmaceutical Ingredients) are of increasing interest, and the development of a TSG process that did not damage such encapsulates would significantly extend applications. Experimental optimisation of TSG is expensive and often sub-optimal because of the high costs of APIs, and it does not lead to a more generic understanding of the process. Computational modelling of the behaviour of individual feed particles during the process will overcome these limitations. The Distinct Element Method (DEM) is the most widely used approach, but it has rarely been applied to the number of particles in a TSG extruder (~55 million), and such examples involve simplified interparticle interactions, e.g. by assuming that the particles are smooth and spherical and that any liquid is present as discrete bridges rather than the greater saturation states associated with granulation. The project will be based on a multiscale strategy to develop advanced interaction laws that are more representative of real systems. The bulk and interfacial properties of a swelling particulate binder such as microcrystalline cellulose will be modelled using coarse-grained Molecular Dynamics to derive inputs into a meso-scale Finite Discrete Element Method model of formulations that include hard particles and a viscous polymeric binder (hydroxypropylcellulose). Elastic particles (e.g. lactose and encapsulates) with viscous binder formulations will be modelled using the Fast Multipole Boundary Element Method. These micro- and meso-scale models will be used to provide closure for a DEM model of TSG. The work will involve collaboration with the Chinese Academy of Sciences, which has pioneered the application of massively parallel high-performance computing with GPU clusters to discrete modelling such as DEM, albeit with existing, simpler interaction laws. An extensive experimental programme will be deployed to measure physical inputs and validate the models. The screw design and operating conditions of TSG for the formulations considered will be optimised using DEM and the results validated empirically. Optimisation criteria will include the granule size distribution, the quality of tablets for granules produced from the lactose formulation, and the minimisation of damage to encapsulates. The primary benefit will be to provide a modelling toolbox for TSG, enabling more rapid and cost-effective optimisation and allowing encapsulated APIs to be processed. Detailed data post-processing will elucidate mechanistic information that will be used to develop regime performance maps.
The multiscale modelling will have applications to a wide range of multiphase systems, as exemplified by a large fraction of consumer products, catalyst pastes for extrusion processes, and agricultural products such as pesticides. The micro- and mesoscopic methods have generic applications for studying the bulk and interfacial behaviour of hard and soft particles, and also of droplets in emulsions. The combination of advanced modelling and implementation on massively parallel high-performance GPU clusters will allow unprecedented applications to multiphase systems of enormous complexity.
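To give a flavour of the DEM building block underlying this programme (a minimal sketch with assumed parameters, not the advanced interaction laws the project will develop), here is a soft-sphere contact between two particles with a linear spring-dashpot normal force:

```python
import numpy as np

# Minimal sketch (not the project's interaction laws): one soft-sphere
# DEM contact between two particles on a line, using a linear
# spring-dashpot normal force and semi-implicit Euler integration.
# All parameter values are assumed.

k_n = 1.0e4      # normal spring stiffness (N/m)
c_n = 0.05       # normal damping coefficient (N s/m); underdamped
radius = 1.0e-3  # particle radius (m)
mass = 1.0e-6    # particle mass (kg)
dt = 1.0e-6      # time step (s), small vs. the contact oscillation period

x = np.array([0.0, 2.0e-3])   # just touching (sum of radii = 2 mm)
v = np.array([0.1, -0.1])     # approaching head-on (m/s)

for _ in range(100):
    overlap = 2 * radius - (x[1] - x[0])
    if overlap > 0:                      # particles in contact
        rel_v = v[1] - v[0]              # negative while approaching
        # repulsive spring + dissipative dashpot along the line of centres
        f = k_n * overlap - c_n * rel_v  # force on particle 1 (+x)
        a = f / mass
        v[0] -= a * dt                   # equal and opposite reaction
        v[1] += a * dt
    x += v * dt

print(f"post-collision velocities: {v[0]:.3f}, {v[1]:.3f} m/s")
```

The dashpot dissipates energy, so the particles rebound with reduced speed (restitution < 1); production DEM codes apply the same idea per contact across millions of particles, which is what makes the GPU-cluster implementation essential.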
- Project . 2015 - 2018 . Funder: UKRI . Project Code: EP/M02525X/1 . Funder Contribution: 341,698 GBP . Partners: City, University of London
The product rule of the familiar operation of taking derivatives of real-valued functions has a plethora of generalisations and applications in algebra. It leads to the notion of derivations of algebras: these are linear endomorphisms of an algebra satisfying the product rule. They represent the classes of the first Hochschild cohomology of an algebra. The first Hochschild cohomology of an algebra turns out to be a Lie algebra, and more precisely a restricted Lie algebra if the underlying base ring is a field of positive characteristic. The (restricted) Lie algebra structure extends to all positive degrees in Hochschild cohomology; this goes back to pioneering work of Gerstenhaber on deformations of algebras. Modular representation theory of finite groups seeks to understand the connections between the structure of finite groups and the associated group algebras. Many of the conjectures that drive this area are - to date mysterious - numerical coincidences relating invariants of finite group algebras to invariants of the underlying groups. The sophisticated cohomological technology hinted at in the previous paragraph is expected to yield some insight regarding these coincidences, and the present proposal puts the focus on some precise and unexplored invariance properties of certain groups of integrable derivations under Morita, derived, or stable equivalences between indecomposable algebra factors of finite group algebras, their character theory, their automorphism groups, and the local structure of finite groups.
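For readers unfamiliar with the objects involved, the following standard definitions (textbook facts, not results of the proposal) summarise the structures in play:

```latex
% Standard definitions: derivations and the Lie algebra structure on
% the first Hochschild cohomology (not results of the proposal).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
A linear map $D\colon A \to A$ is a \emph{derivation} if it satisfies
the product rule
\[
  D(ab) = D(a)\,b + a\,D(b), \qquad a, b \in A .
\]
The derivations form a Lie algebra under the commutator
$[D, D'] = D D' - D' D$, and the first Hochschild cohomology is the
quotient by the inner derivations $a \mapsto ca - ac$:
\[
  HH^1(A) \;\cong\; \operatorname{Der}(A) / \operatorname{Inn}(A).
\]
Over a field of characteristic $p > 0$ this is a \emph{restricted} Lie
algebra: the $p$-th power $D^p$ of a derivation is again a derivation.
\end{document}
```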
- Project . 2015 - 2017 . Funder: UKRI . Project Code: EP/M009718/1 . Funder Contribution: 100,454 GBP . Partners: QMUL
The theory of operator algebras goes back to Murray, von Neumann, Gelfand and Naimark. The original motivation was to provide a mathematical foundation for quantum mechanics. At the same time, from the very beginning of the subject, it was anticipated that operator algebras form very interesting structures in their own right and would have applications to unitary representations of groups and operator theory in Hilbert space. Actually, much more turned out to be true. After some dramatic and unexpected developments, the theory of operator algebras has established itself as a very active and highly interdisciplinary research area. Not only do there exist - as initially envisioned - strong connections to quantum physics as well as representation theory and operator theory, but operator algebras nowadays have far-reaching applications in various mathematical disciplines such as functional analysis, algebra, geometric group theory, geometry, topology and dynamical systems. One of the most important classes of operator algebras is given by C*-algebras, which are defined as norm-closed, self-adjoint algebras of bounded linear operators on a Hilbert space. As in many areas of mathematics, advances in the theory of C*-algebras went hand in hand with the discovery of interesting and illuminating examples, the most prominent ones being group C*-algebras and C*-algebras attached to dynamical systems, the so-called crossed products. The main objects of study in this research project are semigroup C*-algebras, which are natural generalisations of group C*-algebras. Our goal is to analyse the structure of semigroup C*-algebras and to use this construction as a tool to study groups and their subsemigroups from the point of view of geometric group theory. Closely related to this, the project also aims at a better understanding of the interplay between C*-algebras and dynamical systems. Our project lies at the frontier of current research. We take up recent advances in semigroup C*-algebras, classification of C*-algebras, the interplay between C*-algebras and symbolic dynamics, as well as the discovery of rigidity phenomena in operator algebras and dynamical systems. One of the key characteristics of our research project is its highly interdisciplinary character. It lies at the interface of several research areas in mathematics and brings together expertise from different fields. This reflects a broader trend in mathematics: interactions between different branches are becoming more and more important. The mathematical community as a whole therefore benefits through an active and inspiring exchange of ideas.
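As a standard illustration of the central construction (a textbook definition, not specific to this proposal), the reduced semigroup C*-algebra arises from the left regular representation by isometries:

```latex
% Standard definition of the reduced semigroup C*-algebra
% (textbook material, not a result of this proposal).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
For a left-cancellative semigroup $P$, each $p \in P$ acts on
$\ell^2(P)$ by the isometry
\[
  V_p\,\delta_x = \delta_{px}, \qquad x \in P,
\]
and the reduced semigroup C*-algebra is
\[
  C^*_\lambda(P) = C^*\bigl(\{\, V_p : p \in P \,\}\bigr)
  \subseteq \mathcal{B}\bigl(\ell^2(P)\bigr).
\]
For $P = \mathbb{N}$ this recovers the Toeplitz algebra generated by
the unilateral shift; for a group $P$ it is the reduced group
C*-algebra.
\end{document}
```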
- Project . 2015 - 2018 . Funder: UKRI . Project Code: EP/M016005/1 . Funder Contribution: 302,791 GBP . Partners: Three, TRTUK, University of London, VCE Mobile & Personal Comm Ltd
Spectrum is a precious but scarce natural resource. In the UK, Ofcom will free up the analogue TV spectrum at 800MHz (together with the available 2.6GHz band) for 4G, which has already raised £2.34 billion for the national purse. According to Ofcom, the amount of data Britons consume on the move each month has already hit 20 million gigabytes, mainly due to users' engagement with video, TV and films while on the move. It is also commonly understood among mobile operators that by 2020 a 1000-fold increase in system capacity will be needed to avoid mobile networks grinding to a halt. Maximising spectral efficiency, which is limited by interference and fading in wireless networks including 4G, is therefore a major issue. An emerging idea, which is championed by Alcatel-Lucent and has already received serious consideration from vendors and operators, is that of a massive MIMO antenna system. This technology has the potential to unlock the issue of spectrum scarcity and to enhance spectrum usage tremendously by enabling simultaneous access by tens or hundreds of terminals in the same time-frequency resource. For massive MIMO technology to attain its utmost potential, various challenges must be addressed: channel estimation and acquisition under pilot contamination, fast spatio-temporal variations in signal power, and autonomous resource allocation, in particular in the presence of simultaneous access by a large number of users. The focus of this project is on tackling these fundamental challenges by advancing aspects of information theory, estimation theory and network optimisation. In particular, we will contribute in terms of modelling massive MIMO channels underpinned by heterogeneous correlation structures; performing information-theoretic analysis in terms of random matrix theory through shrinkage estimators; robust precoder design for massive MIMO in the presence of channel estimation errors; developing novel channel estimation techniques in the presence of severe pilot contamination; and proposing and analysing game-theoretic algorithms for autonomous resource allocation and pilot assignment. All the concepts and algorithms developed will be integrated, and the radio link layer performance will be assessed using a simulation reference system based on LTE-Advanced standards and their evolution towards 5G. Industrial partners will be engaged throughout the project to ensure the industrial relevance of our work.
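A minimal numerical sketch of the pilot-contamination problem mentioned above (illustrative only; dimensions, SNR and pilot design are assumed, and this is not the project's algorithm) shows why a least-squares channel estimate is biased when two cells reuse the same pilot:

```python
import numpy as np

# Minimal sketch (not the project's method): pilot contamination in
# uplink massive MIMO. Two cells reuse the same pilot sequence, so the
# base station's least-squares estimate of its own user's channel is
# biased by the interfering user's channel. Parameters are assumed.

rng = np.random.default_rng(0)
M = 128                # base-station antennas
tau = 8                # pilot length (symbols)
snr = 10.0             # linear SNR

pilot = np.exp(2j * np.pi * rng.random(tau))   # shared unit-modulus pilot
h_own = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
h_intf = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)

# received pilot block: own user + same-pilot interferer + noise
noise = (rng.standard_normal((M, tau)) + 1j * rng.standard_normal((M, tau)))
noise /= np.sqrt(2 * snr)
Y = np.outer(h_own, pilot) + np.outer(h_intf, pilot) + noise

# least-squares estimate: correlate the received block with the pilot
h_hat = Y @ pilot.conj() / tau

err = np.linalg.norm(h_hat - h_own) / np.linalg.norm(h_own)
print(f"normalised estimation error with pilot reuse: {err:.2f}")
# The estimate converges to h_own + h_intf rather than h_own, so the
# contamination does not vanish even as M or the SNR grows.
```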
- Project . 2015 - 2016 . Funder: UKRI . Project Code: EP/M003620/1 . Funder Contribution: 161,541 GBP . Partners: University of Warwick
The Warwick EPSRC mathematics symposium is organised annually by the University of Warwick with the support of the EPSRC for the benefit of the mathematical sciences community in the UK. It brings leading national and international experts together with UK researchers in a year-long programme of research activities focused on an emerging theme in the mathematical sciences. The proposed symposium for the 2015-16 academic year will concentrate on the theme of "Fluctuation-driven phenomena and large deviations". In very general terms, the symposium will constitute an interdisciplinary focus on understanding the consequences of the interplay between stochasticity and nonlinearity, a recurrent challenge in many areas of the mathematical sciences, engineering and industry. Stochastic processes play a fundamental role in the mathematical sciences, both as tools for constructing models and as abstract mathematical structures in their own right. When nonlinear interactions between stochastic processes are introduced, however, the rigorous understanding of the resulting equations in terms of stochastic analysis becomes very challenging. Mean field theories are useful heuristics which are commonly employed outside of mathematics for dealing with this problem. Mean field theories in one way or another usually involve replacing random variables by their mean and assuming that fluctuations about the mean are approximately Gaussian distributed. In some cases, such models provide a good description of the original system and can be rigorously justified. In many cases they do not. Understanding the latter case, where mean-field models fail, is the central challenge of this symposium. We use "fluctuation driven phenomena" as a generic term to describe the kinds of effects which are observed when mean field theories fail. The challenges stem from the fact that the rich phenomenology of deterministic nonlinear dynamics (singularities, nonlinear resonance, chaos and so forth) is reflected in the stochastic context by a variety of interesting and sometimes unintuitive behaviours: long range correlations, strongly non-Gaussian statistics, coherent structures, absorbing state phase transitions, heavy-tailed probability distributions and enhanced probabilities of large deviations. Such phenomena are found throughout mathematics, both pure and applied, the physical, biological and engineering sciences as well as presenting particular problems to industrialists and policymakers. Contemporary problems such as the forecasting of extreme weather events, the design of marine infrastructure to withstand so-called "rogue waves", quantifying the probability of fluctuation driven transitions or "tipping points" in the climate system or estimating the redundancy required to ensure that infrastructure systems are resilient to shocks all require a step change in our ability to model and predict such fluctuation-driven phenomena. The programme of research activities constituting this symposium will therefore range from the very theoretical to the very applied. At the theoretical end we have random matrix theory which has recently emerged as a powerful tool for analysing the statistics of stochastic processes which are strongly non-Gaussian without the need to go via perturbative techniques developed in the physical sciences such as the renormalisation group. 
At the applied end we have questions of existential importance to the insurance industry, such as how to cost the risk of extreme natural disasters and quantify their interaction with risks inherent in human-built systems. In between, we have research on the connections between large deviation theory and nonequilibrium statistical mechanics, extreme events in the Earth sciences, randomness in the biological sciences, and the latest numerical algorithms for computing rare events, a topic which has seen strong growth in recent years.
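As a worked example of the mathematics at the heart of the theme (a standard large-deviations statement, not a symposium result), Cramér's theorem quantifies how the probability of a fluctuation away from the mean decays, and why Gaussian (mean-field) approximations miss the tails:

```latex
% Cramer's theorem for sample means (standard large-deviations fact,
% not a result of the symposium).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
For i.i.d.\ random variables $X_1, X_2, \dots$ with partial sums
$S_n = X_1 + \dots + X_n$, the probability of a large deviation of the
sample mean decays exponentially,
\[
  \mathbb{P}\!\left(\tfrac{S_n}{n} \ge x\right) \asymp e^{-n I(x)},
\]
with rate function given by the Legendre transform of the cumulant
generating function $\Lambda(\theta) = \log \mathbb{E}\,e^{\theta X_1}$:
\[
  I(x) = \sup_{\theta \in \mathbb{R}}
         \bigl( \theta x - \Lambda(\theta) \bigr).
\]
Gaussian fluctuation theory recovers only the quadratic expansion of
$I$ about the mean; fluctuation-driven phenomena live in its tails.
\end{document}
```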