Advanced search in Projects (377 projects)
Filters: 2012-2021 · UKRI|EPSRC · 2013 · 2016
  • Funder: UKRI Project Code: EP/K031805/1
    Funder Contribution: 221,071 GBP

    During the last twenty years mathematics and physics have significantly influenced each other and have become highly entangled. Mathematical physics has always produced a wide variety of new concepts and problems that became important subjects of pure mathematical research. The growth of gauge, gravity and string theories has made the relation between these subjects closer than ever before. An important driving force was the discovery of quantum groups and of the gauge/gravity dualities, where the leading role was played by the so-called AdS/CFT duality and its underlying integrable structure. A far-reaching concept is the effect of boundaries and the corresponding boundary conditions. They are unavoidable in almost all models of mathematical physics and are of fundamental importance. Introducing boundaries into the theory of quantum groups leads to a whole new class of so-called reflection algebras. Such algebras appear in numerous mathematical models and lie at the core of their integrable structure; they have also been shown to play a prominent role in AdS/CFT. However, a coherent framework for describing such algebras is not known, and many properties of reflection algebras remain open questions. The goal of this research is to develop new algebraic methods and intradisciplinary connections between the axiomatic theory of algebras and the theory of quantum groups, inspired by the integrable structure of AdS/CFT, in particular by shedding more light on the effects of boundaries and of different boundary configurations. The research applies algebraic objects such as quantum affine and Yangian algebras to find elegant, exact solutions of the models that arise from, and are inspired by, the gauge/gravity dualities.

    Views: 9 · Downloads: 67
  • Funder: UKRI Project Code: EP/K033166/1
    Funder Contribution: 587,661 GBP

    Future deployments of wireless sensor network (WSN) infrastructures for environmental, industrial or event monitoring are expected to be equipped with energy harvesters (e.g. piezoelectric, thermal or photovoltaic) in order to substantially increase their autonomy and lifetime. However, it is also widely recognized that the existing gap between the sensors' energy availability and the sensors' energy consumption requirements is not likely to close in the near future, due to limitations in current energy harvesting (EH) technology together with the surge in demand for more data-intensive applications. Hence, perpetually operating WSNs are currently impossible to realize for data-intensive applications, as significant (and costly) human intervention is required to replace batteries. With the continuous improvement of energy efficiency representing a major driver of WSN research, the major objective of this research project is to develop transformative sensing mechanisms, which can be used in conjunction with current or upcoming EH capabilities, in order to enable the deployment of energy-neutral or nearly energy-neutral WSNs with practical network lifetimes and data gathering rates up to two orders of magnitude higher than the current state of the art. The theoretical foundations of the proposed research are the emerging paradigms of compressive sensing (CS) and distributed compressive sensing (DCS), as well as energy- and information-optimal data acquisition and transmission protocols. These elements offer the means to tightly couple the energy consumption process to the random nature of the energy harvesting process in a WSN, in order to achieve the breakthroughs in network lifetime and data gathering rates. (An illustrative compressive-sensing sketch appears after this entry.) The proposed project brings together a team of theoreticians and experimentalists working in areas of the EPSRC ICT portfolio that have been identified for expansion. This team is well placed to develop, implement and evaluate the novel WSN technology. The consortium also comprises a number of established and early-stage companies that clearly view the project as one that will impact their medium- and long-term product developments and also strengthen their strategic links with world-class academic institutions. We anticipate that a successful demonstration of the novel WSN technology will generate significant interest in the machine-to-machine (M2M) and Internet of Things (IoT) industries both in the UK and abroad.

    Views: 24 · Downloads: 166
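
Compressive sensing, one of the theoretical foundations named in the abstract above, can be illustrated with a minimal sketch. This is not the project's code: the dimensions, sparsity level, regularisation weight and the ISTA solver below are illustrative assumptions. The point is only that a sparse signal can be recovered from far fewer random measurements than samples, which is what would let a sensor node acquire and transmit less data.

```python
import numpy as np

def ista(A, y, lam=0.05, n_iters=500):
    """Recover a sparse signal from y = A @ x via iterative soft-thresholding (ISTA)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the least-squares gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)           # gradient of 0.5 * ||A x - y||^2
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, m, k = 200, 60, 5                        # signal length, measurements, non-zeros
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(0.0, 1.0, k)
    A = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))   # random Gaussian sensing matrix
    y = A @ x_true                              # m << n compressed measurements
    x_hat = ista(A, y)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
```

A real deployment would tune the solver and measurement design to the sensed signals; the sketch only shows the shape of the recovery problem.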
  • Funder: UKRI Project Code: EP/L003309/1
    Funder Contribution: 973,522 GBP

    The overall aim of the proposed research is to enable the development and operation of new, agile, more cost-effective and sustainable chemical manufacturing processes. The future of sustainable chemicals manufacturing lies in flexible, modular and intensive processes. New automated reaction tools and hardware are becoming ubiquitous, but optimisation of how they are used, and the methods for dealing with the larger amounts of experimental data now available, are still largely manual, and generally only carried out for long-duration production runs. A crucial missing component is a fast, automated, closed-loop methodology for developing and running optimised chemicals manufacturing processes. This proposal will close this gap by developing an automated system for experimentation that brings together automated hardware for reaction execution, methods for reaction-composition data acquisition and analysis, the intelligent selection of future experiments, and the development of process models in real time. (A generic closed-loop optimisation sketch appears after this entry.) The multi-disciplinary challenge of this topic requires research in a variety of fields, including chemistry, statistics, engineering, chemometrics and computer science. Each of the individual research questions is a novel and substantial challenge in its own right; their fusion will allow the automatic optimisation of reaction chemistry for a variety of applications and on a variety of different scales. Such a system would become a key tool in both academic and industrial chemistry, making feasible the routine manufacture of even small amounts of material via optimised processes, and increasing the efficiency of processes on all scales. Hence, it has the potential to enable new ways of working towards sustainable and green chemistry.

    Views: 12 · Downloads: 48
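
The closed-loop idea described above (execute an experiment, analyse the result, select the next conditions automatically) can be sketched generically. Everything below is an illustrative assumption rather than the project's methodology: the "reaction" is a toy yield function of temperature, the surrogate is a simple quadratic fit, and the selection rule just picks the untried condition with the highest predicted yield.

```python
import numpy as np

def run_experiment(temperature, rng):
    """Stand-in for automated reaction hardware: noisy yield peaking near 80 degC (illustrative)."""
    return 90.0 - 0.02 * (temperature - 80.0) ** 2 + rng.normal(0.0, 1.0)

def closed_loop(n_rounds=10, seed=0):
    rng = np.random.default_rng(seed)
    candidates = np.linspace(40.0, 120.0, 81)        # candidate reaction temperatures
    tried = [50.0, 80.0, 110.0]                      # initial screening experiments
    yields = [run_experiment(t, rng) for t in tried]
    for _ in range(n_rounds):
        coeffs = np.polyfit(tried, yields, deg=2)    # fit a simple quadratic surrogate model
        untried = [t for t in candidates if t not in tried]
        predictions = np.polyval(coeffs, untried)
        next_t = untried[int(np.argmax(predictions))]  # choose the most promising condition
        tried.append(next_t)
        yields.append(run_experiment(next_t, rng))   # execute and analyse the new experiment
    best = int(np.argmax(yields))
    return tried[best], yields[best]

if __name__ == "__main__":
    t, y = closed_loop()
    print(f"best condition found: {t:.1f} degC with yield {y:.1f}%")
```

In practice the selection step would be more sophisticated (e.g. balancing exploration and exploitation) and the model would cover many reaction variables; the sketch only shows how experiment execution, data analysis, model fitting and experiment selection chain into one loop.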
  • Funder: UKRI Project Code: EP/K006835/1
    Funder Contribution: 354,296 GBP

    The global market for lithium-ion batteries is expected to increase from an estimated $8bn in 2008 to $30bn by 2017, according to independent market analyst Takeshita. Lithium-air or lithium-oxygen batteries are an important technology for future energy storage because they have theoretical energy densities almost an order of magnitude greater than the state-of-the-art Li-ion battery. The long-term energy storage needs of society are likely to demand batteries both for stationary power storage, to collect unwanted energy generated from wind farms, and to power electric vehicles. The success of these technologies underpins the UK's need to move to a lower-carbon, greener economy which is less reliant on carbon dioxide-generating fossil fuels. The development of lithium-oxygen batteries is being hampered by a lack of understanding of the complexity of products formed on the air-cathode during reduction and oxidation. Spectroscopy is critical for identifying these products and for understanding the chemistry at the electrode interfaces. Moreover, advanced in situ spectroelectrochemical techniques help us to comprehend these complex interfaces whilst under full electrochemical control. A particularly sensitive technique, surface-enhanced infrared absorption spectroscopy (SEIRAS), has not been applied to these systems. Furthermore, the development of in situ far-IR spectroscopy would enable us to identify lithium-oxygen compounds at these low frequencies. The goal of this proposal is therefore to further the progress of lithium-oxygen technology by fully understanding the reduction and oxidation pathways taking place within the battery and by comprehending the role of electrocatalytic surfaces.

    Views: 10 · Downloads: 18
  • Funder: UKRI Project Code: EP/J019720/1
    Funder Contribution: 664,248 GBP

    New ideas for carbon capture are urgently needed to combat climate change. Retro-fitting post-combustion carbon capture to existing power plants has the greatest potential to reduce CO2 emissions, considering that these sources make the largest contribution to CO2 emissions in the UK. Unfortunately, carbon capture methods based on existing industrial process technology for separation of CO2 from natural gas streams (i.e. amine scrubbing) would be extremely expensive if applied on the scale envisaged, as exemplified by the recent collapse of the Government's CCS project at Longannet power station. Moreover, many of the chemical absorbents used, typically amines, are corrosive and toxic, and their use could generate significant amounts of hazardous waste. So, more efficient and 'greener' post-combustion CCS technologies are urgently needed if CCS is to be adopted on a global scale. Efficient separation of CO2 from flue gases requires at least the following: i) an inexpensive sorbent with high CO2 working capacity and selectivity, ii) high rates of CO2 mass transfer into and out of the sorbent, and iii) a low energy cost for sorbent regeneration. A traditional aqueous amine scrubbing process has high selectivity, but is less effective in terms of capacity, mass transfer rate, and sorbent regeneration energy penalty. Here, we propose to investigate a novel process based on the 'wetting layer absorption' (WLA) concept, in which a porous material is used to support liquid-like regions of absorbing solvent, which in turn absorb the gas of interest, in this case carbon dioxide. This process, recently invented by one of the authors (MS) of this proposal at Strathclyde, is being pioneered by researchers in Scotland. Initial work involved investigation of the use of physical solvents; here the focus is on a process involving chemical solvents, i.e. amines. This process should have a high capacity, high selectivity, and high rates of mass transfer. Another novel aspect of this work is the investigation of microwave regeneration, which could also result in much-reduced costs for sorbent regeneration. Finally, the process would involve orders-of-magnitude reductions in solvent recycling, and could make use of much less toxic and corrosive solvents, leading to a much greener process. Ultimately, the WLA process involving chemical solvents could significantly reduce the cost and environmental impact of carbon capture.

  • Funder: UKRI Project Code: EP/J010790/1
    Funder Contribution: 613,852 GBP

    String theory is believed to be a theory capable of describing all the known forces of nature, and it provides a solution to the venerable problem of finding a theory of gravity consistent with quantum mechanics. To a first approximation, the world we observe corresponds to a vacuum of this theory. String theory admits many such vacuum states, and the class most likely to describe the observed world is the class of so-called 'heterotic vacua'. Analysing these vacua requires the application of sophisticated tools drawn from mathematics, particularly from algebraic geometry. If history is any guide, the synthesis of these mathematical tools with observations drawn from physics will lead not only to significant progress in physics, but also to important advances in mathematics. An example of such a major insight in mathematics that arose from string theory is mirror symmetry: the observation that, within a restricted class of string vacua, the vacua arise in 'mirror pairs'. This has the consequence that certain mathematical quantities, which are both important and otherwise mysterious, can be calculated in a straightforward manner. The heterotic vacua of interest here form a wider class, and an important question is to what extent mirror symmetry generalises and how it acts on this wider class. In a more precise description, the space of heterotic vacua is the parameter space of pairs (X,V), where X is a Calabi-Yau manifold and V is a stable holomorphic vector bundle on X. This space is a major object of study in algebra and geometry. String theory tells us that it is subject to quantum corrections. Understanding the nature of these corrections is the key research problem in this proposal, and any advance in our understanding will have an important impact in both mathematics and physics. By now it is widely understood that string theory and geometry are intimately related, with much to be learned from each other, yet this relationship is relatively unexplored for the heterotic string. This fact, together with recent developments indicating that longstanding problems have become tractable, means that the time is right to revisit the geometry of heterotic vacua.

    Views: 2 · Downloads: 2
  • Funder: UKRI Project Code: EP/K009850/1
    Funder Contribution: 158,970 GBP

    We are in the midst of an information revolution, where advances in science and technology, as well as the day-to-day operation of successful organisations and businesses, are increasingly reliant on the analysis of data. Driving these advances is a deluge of data which is far outstripping the increase in available computational power. The importance of managing, analysing, and deriving useful understanding from such large-scale data is highlighted by high-profile reports by McKinsey and The Economist as well as other outlets, and by the EPSRC's recent ICT priority of "Towards an Intelligent Information Infrastructure". Bayesian analysis is one of the most successful families of methods for analysing data, and one now widely adopted in the statistical sciences as well as in AI technologies like machine learning. The Bayesian approach offers a number of attractive advantages over other methods: flexibility in constructing complex models from simple parts; fully coherent inferences from data; natural incorporation of prior knowledge; explicit modelling assumptions; precise reasoning about uncertainties over model order and parameters; and protection against overfitting. On the other hand, there is a general perception that Bayesian methods can be too slow to be practically useful on big data sets. This is because exact Bayesian computations are typically intractable, so a range of more practical approximate algorithms are needed, including variational approximations, sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). MCMC methods arguably form the most popular class of Bayesian computational techniques, due to their flexibility, general applicability and asymptotic exactness. Unfortunately, MCMC methods do not scale well to big data sets, since they require many iterations to reduce Monte Carlo noise, and each iteration already involves an expensive sweep through the whole data set. In this project we propose to develop the theoretical foundations for a new class of MCMC inference procedures that can scale to billions of data items, thus unlocking the strengths of Bayesian methods for big data. The basic idea is to use a small subset of the data during each parameter update iteration of the algorithm, so that many iterations can be performed cheaply. This introduces excess stochasticity into the algorithm, which can be controlled by annealing the update step sizes towards zero as the number of iterations increases. The resulting algorithm is a cross between an MCMC algorithm and a stochastic optimization algorithm. An initial exploration of this procedure, which we call stochastic gradient Langevin dynamics (SGLD), was initiated by us recently (Welling and Teh, ICML 2011). (A minimal SGLD sketch appears after this entry.) Our proposal is to lay the mathematical foundations for understanding the theoretical properties of such stochastic MCMC algorithms, and to build on these foundations to develop more sophisticated algorithms. We aim to understand the conditions under which the algorithm is guaranteed to converge, and the type and speed of convergence. Using this understanding, we aim to develop algorithmic extensions and generalizations with better convergence properties, including preconditioning, adaptive and Riemannian methods, Hamiltonian Monte Carlo methods, online Bayesian learning methods, and approximate methods with large step sizes. These algorithms will be empirically validated on real-world problems, including large-scale data analysis problems for text processing and collaborative filtering, which are standard problems in machine learning, and large-scale data from ID Analytics, a partner company interested in detecting identity theft and fraud.

    Views: 7 · Downloads: 14
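
The stochastic gradient Langevin dynamics (SGLD) procedure described above can be illustrated with a minimal sketch for a toy model. The model (the mean of a 1-D Gaussian with a standard-normal prior), the mini-batch size and the step-size schedule below are illustrative assumptions; the essential ingredients follow Welling and Teh (2011): each update uses a small random subset of the data rescaled by N/n, adds Gaussian noise whose variance matches the step size, and anneals the step size towards zero.

```python
import numpy as np

def sgld_gaussian_mean(data, n_iters=5000, batch_size=32, a=1e-5, b=10.0, gamma=0.55):
    """Minimal SGLD sketch: sample the posterior mean of a unit-variance Gaussian.

    Prior: theta ~ N(0, 1). Each iteration uses a mini-batch gradient rescaled by
    N / batch_size and injects Gaussian noise; the step size eps_t = a * (b + t)**(-gamma)
    is annealed towards zero as the iteration count grows.
    """
    N = len(data)
    theta = 0.0
    samples = []
    for t in range(n_iters):
        eps = a * (b + t) ** (-gamma)                    # annealed step size
        batch = np.random.choice(data, size=batch_size)  # small random subset of the data
        grad_log_prior = -theta                          # d/dtheta log N(theta; 0, 1)
        grad_log_lik = (N / batch_size) * np.sum(batch - theta)  # rescaled mini-batch gradient
        noise = np.random.normal(0.0, np.sqrt(eps))      # injected Langevin noise
        theta += 0.5 * eps * (grad_log_prior + grad_log_lik) + noise
        samples.append(theta)
    return np.array(samples)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(2.0, 1.0, size=10_000)
    draws = sgld_gaussian_mean(data)
    # Discard an initial burn-in; the remaining draws should centre near the true mean of 2.0.
    print("posterior mean estimate:", draws[1000:].mean())
```

The scale of the step size matters: it must be small relative to the posterior width (roughly 1/N here) for the chain to remain stable, which is part of what the project's theoretical analysis of convergence would address.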
  • Funder: UKRI Project Code: EP/K011588/1
    Funder Contribution: 255,173 GBP

    Differential geometry is the study of "smooth shapes", e.g. curved surfaces that have no rough edges or sharp bends. A surface is a 2-dimensional object, and one can similarly imagine smooth shapes that are 1-dimensional, such as a line, or curve, or circle. What is much harder to imagine, but can nonetheless be described in precise mathematical terms, is a smooth shape in an arbitrary number of dimensions: these objects are called "manifolds". A specific example of a 2-dimensional manifold is a disk, i.e. the region inside a circle, and its "boundary" is a 1-dimensional manifold, namely the circle. Similarly, for any positive integer n, an n-dimensional manifold may have a boundary which is an (n-1)-dimensional manifold. All the 3-dimensional manifolds that we can easily picture are of this type: e.g. if we imagine any surface in 3-dimensional space, such as a sphere or a "torus" (the shape of the surface of a doughnut), then the region inside that surface is a 3-dimensional manifold whose boundary is the surface. We can now ask one of the most basic questions concerning manifolds: given an n-dimensional manifold, is it the boundary of something? This is actually not just a geometric question, but really a question of "topology", which is a certain way of studying the "overall shape" of geometric objects. As in the example given above, most 2-dimensional manifolds that we can easily imagine are boundaries of the 3-dimensional regions they enclose. But for a more interesting example, we can try to imagine a "Klein bottle": this is a surface formed by taking an ordinary bottle and bending its opening around and through the glass into the inside, then connecting the opening to the floor of the bottle by curving the floor upward. The result is a surface that is not a boundary of anything, as its inside is not distinct from its outside; like a Moebius strip, but closed in on itself. The subject of this proposal concerns a more elaborate version of the above question about boundaries: we deal with a particular type of manifold in an even number of dimensions, called "symplectic" manifolds, and their odd-dimensional boundaries are called "contact" manifolds. The idea of a symplectic manifold comes originally from physics: a century ago, symplectic manifolds were understood to be the natural geometric setting in which to study Hamilton's 19th century reformulation of Newton's classical mechanics. Today symplectic manifolds are considered interesting in their own right, and they retain a connection to physics, but of a very different and non-classical sort: by studying certain special surfaces in symplectic manifolds with contact boundary, one can define a so-called "Symplectic Field Theory" (or "SFT" for short), which bears a strong but mysterious resemblance to some of the theories that modern physics uses to describe elementary particles and their interactions. Unlike those theories, SFT does not help us to predict what will happen in a particle accelerator, but it can help us answer a basic question in the area of "Symplectic and Contact Topology": given a contact manifold, is it the boundary of any symplectic manifold? More generally, one way to study contact manifolds themselves is to consider the following relation: we say that two such manifolds are "symplectically cobordant" if they form two separate pieces of the boundary of a symplectic manifold. 
The question of whether two given contact manifolds are cobordant helps us understand what kinds of contact manifolds can exist in the first place, and Symplectic Field Theory is one of the most powerful methods we have for studying this. The goal of this project is thus to use this and related tools to learn as much as we can about the symplectic cobordism relation on contact manifolds. Since most previous results on this subject have focused on 4-dimensional manifolds with 3-dimensional boundaries, we aim especially to gain new insights in higher dimensions.

  • Funder: UKRI Project Code: EP/I004130/2
    Funder Contribution: 322,634 GBP

    In homotopy theory, topological spaces (i.e. shapes) are regarded as being the same if we can deform continuously from one to the other. Algebraic varieties are spaces defined by polynomial equations, often over the complex numbers; studying their homotopy theory means trying to tell which topological spaces can be deformed continuously to get algebraic varieties, or when a continuous map between algebraic varieties can be continuously deformed to a map defined by polynomials. If the polynomials defining a variety are rational numbers (i.e. fractions), this automatically gives the complex variety a group of symmetries, called the Galois group. Although these symmetries are not continuous (i.e. nearby points can be sent far apart), they preserve something called the etale topology. This is an abstract concept which looks somewhat unnatural, but behaves well enough to preserve many of the topological features of the variety. Part of my project will involve investigating how the Galois group interacts with the etale topology. I also study algebraic varieties in finite and mixed characteristics. Finite characteristics are universes in which the rules of arithmetic are modified by choosing a prime number p and setting it to zero. For instance, in characteristic 3 the equation 1+1+1=0 holds. In mixed characteristic, p need not be 0, but the sequence 1, p, p^2, p^3, ... converges to 0. Although classical geometry of varieties does not make sense in finite and mixed characteristics, the etale topology provides a suitable alternative, allowing us to gain much valuable insight into the behaviour of the Galois group. This is an area which I find fascinating, as much topological intuition still works in contexts far removed from real and complex geometry. Indeed, many results in complex geometry have been motivated by phenomena observed in finite characteristic. Moduli spaces parametrise classes of geometric objects, and can themselves often be given geometric structures, similar to those of algebraic varieties. This structure tends to misbehave at points parametrising objects with a lot of symmetry. To obviate this difficulty, algebraic geometers work with moduli stacks, which parametrise the symmetries as well as the objects. Sometimes the symmetries can themselves have symmetries and so on, giving rise to infinity stacks. Usually, the dimension of a moduli stack can be calculated by naively counting the degrees of freedom in defining the geometric object it parametrises. However, the space usually contains singularities (points where the space is not smooth), and regions of different dimensions. Partially inspired by ideas from theoretical physics, it has been conjectured that every moduli stack can be extended to a derived moduli stack, which would have the expected dimension, but with some of the dimensions only virtual. Extending to these virtual dimensions also removes the singularities, a phenomenon known as 'hidden smoothness'. Different classification problems can give rise to the same moduli stack, but different derived moduli stacks. Much of my work will be to try to construct derived moduli stacks for a large class of problems. This has important applications in algebraic geometry, as there are many problems for which the moduli stacks are unmanageable, but which should become accessible using derived moduli stacks. I will also seek to investigate the geometry and behaviour of derived stacks themselves. A common thread through the various aspects of my project will be to find ways of applying powerful ideas and techniques from a branch of topology, namely homotopy theory, in contexts where they would not, at first sight, appear to be relevant.

    Views: 2 · Downloads: 1
  • Funder: UKRI Project Code: EP/J019844/1
    Funder Contribution: 263,385 GBP

    Organic molecular monolayers at surfaces often constitute the central working component in nanotechnologies such as sensors, molecular electronics, smart coatings, organic solar cells, catalysts, medical devices, etc. A central challenge in the field is to achieve controlled creation of desired 2D molecular architectures at surfaces. Within this context, the past decade has witnessed a real and significant step-change in the 'bottom-up' self-organisation of 2D molecular assemblies at surfaces. The enormous variety and abundance of molecular structures formed via self-organisation has now critically tipped the argument strongly in favour of a 'bottom-up' construction strategy, which harnesses two powerful attributes: nanometer precision (inaccessible to top-down methods) and highly parallel fabrication (impossible with atomic/molecular manipulation). Thus, bottom-up molecular assembly at surfaces holds the real possibility of becoming a dominant synthesis protocol in 21st-century nanotechnologies. Uniquely, the scope and versatility of these molecular architectures at 2D surfaces have been directly captured at the nanoscale via imaging with scanning probe microscopies and advanced surface spectroscopies. At present, however, the field is largely restricted to a 'make and see' approach, and there is scarce understanding of any of the parameters that ultimately control molecular surface assembly. For example: (1) molecular assemblies at surfaces show highly polymorphic behaviour, and a priori control of assembly is practically non-existent; (2) little is understood of the influence and balance of the many interactions that drive molecular recognition and assembly (molecule-molecule interactions including dispersion, directional H-bonding and strong electrostatic and covalent interactions); (3) the role of surface-molecule interactions is largely uncharted, even though they play a significant role in the diffusion of molecules and their subsequent assembly; (4) there is ample evidence that the kinetics of self-assembly is the major factor in determining the final structure, often driving polymorphic behaviour and leading to widely varied outcomes depending on the conditions of formation; (5) a gamut of additional surface phenomena also influence assembly, e.g. chemical reactions between molecules, thermally activated internal degrees of freedom of molecules, surface reconstructions and co-assembly via coordinating surface atoms. The main objective of this project is to advance from experimental phenomena-reporting to knowledge-based design, and its central goal is to identify the role played by thermodynamic, entropic, kinetic and chemical factors in dictating molecular organisation at surfaces under given experimental conditions. Addressing this challenge requires a two-pronged approach in which ambitious and comprehensive theory development is undertaken alongside powerful imaging and spectroscopic tools applied to the same systems. This synergy of experiment and theory is essential to develop a fundamental understanding, which would enable a roadmap for controlled and engineered self-assembly at surfaces to be proposed that would, ultimately, allow one to 'dial up' a required structure at will. Four important and qualitatively different classes of assembly at surfaces will be studied: Molecular Self-Assembly; Hierarchical Self-Assembly; Metal-Organic Self-Assembly; and on-surface Covalent Assembly.

    Views: 23 · Downloads: 22