#### UKRI Funded Projects, 2013-2016

#### Project EP/J019720/1: University of Edinburgh (2013-2016)

Funder: UKRI. Funder contribution: 664,248 GBP.

New ideas for carbon capture are urgently needed to combat climate change. Retro-fitting post-combustion carbon capture to existing power plants has the greatest potential to reduce CO2 emissions, since these sources make the largest contribution to CO2 emissions in the UK. Unfortunately, carbon capture methods based on existing industrial process technology for separating CO2 from natural gas streams (i.e. amine scrubbing) would be extremely expensive if applied on the scale envisaged, as exemplified by the recent collapse of the Government's CCS project at Longannet power station. Moreover, many of the chemical absorbents used, typically amines, are corrosive and toxic, and their use could generate significant amounts of hazardous waste. More efficient and 'greener' post-combustion CCS technologies are therefore urgently needed if CCS is to be adopted on a global scale. Efficient separation of CO2 from flue gases requires at least the following: i) an inexpensive sorbent with high CO2 working capacity and selectivity; ii) high rates of CO2 mass transfer into and out of the sorbent; and iii) a low energy cost for sorbent regeneration. A traditional aqueous amine scrubbing process has high selectivity, but is less effective in terms of capacity, mass transfer rate, and sorbent regeneration energy penalty. Here, we propose to investigate a novel process based on the 'wetting layer absorption' (WLA) concept, in which a porous material is used to support liquid-like regions of absorbing solvent, which in turn absorb the gas of interest, in this case carbon dioxide. This process, recently invented by one of the authors (MS) of this proposal at Strathclyde, is being pioneered by researchers in Scotland. Initial work investigated the use of physical solvents; here the focus is on a process involving chemical solvents, i.e. amines. This process should have high capacity, high selectivity, and high rates of mass transfer. Another novel aspect of this work is the investigation of microwave regeneration, which could also greatly reduce the cost of sorbent regeneration. Finally, the process would involve orders-of-magnitude reductions in solvent recycling, and could make use of much less toxic and corrosive solvents, leading to a much greener process. Ultimately, the WLA process involving chemical solvents could significantly reduce the cost and environmental impact of carbon capture.


For further information contact us at __helpdesk@openaire.eu__.

#### Project EP/K031805/1: University of Surrey (2013-2016)

Funder: UKRI. Funder contribution: 221,071 GBP.

Over the last twenty years mathematics and physics have significantly influenced each other and become highly entangled. Mathematical physics has always produced a wide variety of new concepts and problems that have become important subjects of pure mathematical research. The growth of gauge, gravity and string theories has made the relation between these subjects closer than ever before. An important driving force was the discovery of quantum groups and of the gauge/gravity dualities. Here the leading role was played by the so-called AdS/CFT duality and its underlying integrable structure. A far-reaching concept is the effect of boundaries and the corresponding boundary conditions. They are unavoidable in almost all models of mathematical physics and are of fundamental importance. The introduction of boundaries into the theory of quantum groups leads to a whole new class of the so-called reflection algebras. Such algebras have been shown to appear in numerous mathematical models and to lie at the core of their integrable structure. Furthermore, these algebras have also been shown to play a prominent role in AdS/CFT. However, a coherent framework for describing such algebras is not known, and many properties of the reflection algebras remain open questions. The goal of this research is to develop new algebraic methods and intradisciplinary connections between the axiomatic theory of algebras and the theory of quantum groups, inspired by the integrable structure of AdS/CFT, in particular by shedding more light on the effects of boundaries and different boundary configurations. The research is driven by applying algebraic objects such as quantum affine and Yangian algebras to find elegant, exact solutions describing models that arise from and are inspired by the gauge/gravity dualities.


#### Project EP/K030159/1: University of Warwick (2013-2016)

Funder: UKRI. Funder contribution: 342,000 GBP.

Ultrasound is used in many applications, including medical imaging, non-destructive evaluation, and therapeutic ultrasound. In all these cases there is usually a need to form images or create a focal region. Current methods for generating optimal acoustic fields generally rely on a linear process within the transducer. This linear transduction process limits the resultant field in terms of spatial resolution and maximum intensity: there are fundamental limits on the spatial resolution and power densities that can be achieved in such focal regions. Recent work in acoustics has demonstrated that a new type of acoustic signal can be generated via non-linear effects in chains of particles, which act as a kind of waveguide. These signals are based on the propagation of solitary waves. They have been studied at low frequencies, but this study will look at the possibility of using these new structures in biomedical ultrasound. Materials that support solitary waves are not used in standard ultrasonic work, and little has been published on their use, despite the fact that a step change in performance may be possible. In this proposal, such waves will be generated within ultrasonic sources containing multiple solitary wave chains, at frequencies in the 500 kHz - 5 MHz range. To our knowledge, this has not been investigated before. Arrays are also possible, where each chain forms a single element. Because the chains would be coupled primarily along their length, but not laterally between chains, issues arising from mechanical cross-coupling might be avoided. Pre-compression of each chain would alter the propagation velocity within it, allowing beam steering and focusing. The propagation characteristics also change with signal amplitude, leading to the possibility of an acoustic diode. These innovations would have applications in areas such as ultrasound-enhanced drug delivery, High Intensity Focused Ultrasound (HIFU) for the treatment of tumours, and harmonic imaging.
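The solitary waves described above arise in chains of beads coupled by Hertzian contacts. A minimal numerical sketch (all parameters are arbitrary dimensionless choices of mine, not values from the proposal) shows a strike at one end of such a chain propagating as a compact pulse:

```python
# Illustrative sketch: a 1-D chain of unit-mass beads with Hertzian contacts,
# the classic setting in which solitary waves arise.  Parameters are arbitrary.

def hertz_force(overlap, k=5000.0):
    """Hertzian contact: force ~ overlap^(3/2), zero once beads separate."""
    return k * overlap ** 1.5 if overlap > 0.0 else 0.0

def simulate_chain(n=30, spacing=1.0, v0=1.0, dt=1e-4, steps=30000):
    """Velocity-Verlet integration of a chain struck at the left end."""
    x = [i * spacing for i in range(n)]   # beads initially just touching
    v = [0.0] * n
    v[0] = v0                             # strike the first bead

    def accel(x):
        a = [0.0] * n
        for i in range(n - 1):
            f = hertz_force(spacing - (x[i + 1] - x[i]))
            a[i] -= f                     # reaction on the left bead
            a[i + 1] += f                 # push on the right bead
        return a

    a = accel(x)
    for _ in range(steps):
        for i in range(n):
            x[i] += v[i] * dt + 0.5 * a[i] * dt * dt
        a_new = accel(x)
        for i in range(n):
            v[i] += 0.5 * (a[i] + a_new[i]) * dt
        a = a_new
    return x, v

positions, velocities = simulate_chain()
```

Because the contact force vanishes when beads separate, the medium has no linear sound speed, which is what gives rise to the strongly localised travelling pulse rather than a dispersive wave.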


#### Project EP/J019844/1: KCL (2013-2016)

Funder: UKRI. Funder contribution: 263,385 GBP.

Organic molecular monolayers at surfaces often constitute the central working component in nanotechnologies such as sensors, molecular electronics, smart coatings, organic solar cells, catalysts, and medical devices. A central challenge in the field is to achieve controlled creation of desired 2D molecular architectures at surfaces. Within this context, the past decade has witnessed a real and significant step-change in the 'bottom-up' self-organisation of 2D molecular assemblies at surfaces. The enormous variety and abundance of molecular structures formed via self-organisation has now tipped the argument strongly in favour of a 'bottom-up' construction strategy, which harnesses two powerful attributes: nanometre precision (inaccessible to top-down methods) and highly parallel fabrication (impossible with atomic/molecular manipulation). Thus, bottom-up molecular assembly at surfaces holds the real possibility of becoming a dominant synthesis protocol in 21st-century nanotechnologies. Uniquely, the scope and versatility of these molecular architectures at 2D surfaces have been directly captured at the nanoscale via imaging with scanning probe microscopies and advanced surface spectroscopies. At present, however, the field is largely restricted to a 'make and see' approach, and there is scarce understanding of the parameters that ultimately control molecular surface assembly. For example: (1) molecular assemblies at surfaces show highly polymorphic behaviour, and a priori control of assembly is practically non-existent; (2) little is understood of the influence and balance of the many interactions that drive molecular recognition and assembly (molecule-molecule interactions including dispersion, directional H-bonding, and strong electrostatic and covalent interactions); (3) the role of surface-molecule interactions is largely uncharted, even though they play a significant role in the diffusion of molecules and their subsequent assembly; (4) there is ample evidence that the kinetics of self-assembly is the major factor in determining the final structure, often driving polymorphic behaviour and leading to widely varied outcomes depending on the conditions of formation; (5) a gamut of additional surface phenomena also influence assembly, e.g. chemical reactions between molecules, thermally activated internal degrees of freedom of molecules, surface reconstructions, and co-assembly via coordinating surface atoms. The main objective of this project is to advance from experimental phenomena-reporting to knowledge-based design, and its central goal is to identify the role played by thermodynamic, entropic, kinetic and chemical factors in dictating molecular organisation at surfaces under given experimental conditions. Addressing this challenge requires a two-pronged approach in which ambitious and comprehensive theory development is undertaken alongside powerful imaging and spectroscopic tools applied to the same systems. This synergy of experiment and theory is essential to develop a fundamental understanding, which would enable a roadmap for controlled and engineered self-assembly at surfaces that would, ultimately, allow one to 'dial up' a required structure at will. Four important and qualitatively different classes of assembly at surfaces will be studied: molecular self-assembly; hierarchical self-assembly; metal-organic self-assembly; and on-surface covalent assembly.
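The interplay of energetics and kinetics in surface assembly (point 4 above) can be caricatured with a toy 2-D lattice-gas Metropolis simulation of attractively interacting molecules diffusing on a surface. Everything here is an invented illustration of the general idea, not a model from the proposal:

```python
# Illustrative sketch: molecules hop on a periodic 2-D lattice and gain energy
# EPS for each occupied nearest-neighbour site.  At low temperature the
# thermodynamic drive is toward compact islands; truncating the run early
# mimics kinetic trapping.  All parameters are arbitrary.
import math
import random

L_SIDE = 20       # lattice side (periodic boundaries)
N_MOL = 60        # number of adsorbed molecules
EPS = 1.0         # nearest-neighbour attraction

def neighbours(site):
    x, y = site
    return [((x + 1) % L_SIDE, y), ((x - 1) % L_SIDE, y),
            (x, (y + 1) % L_SIDE), (x, (y - 1) % L_SIDE)]

def energy(occ):
    """Total energy: -EPS per occupied nearest-neighbour pair."""
    return -EPS * sum(1 for s in occ for n in neighbours(s) if n in occ) / 2.0

def metropolis(T=0.2, steps=20000, seed=1):
    rng = random.Random(seed)
    occ = set()
    while len(occ) < N_MOL:                  # random initial adlayer
        occ.add((rng.randrange(L_SIDE), rng.randrange(L_SIDE)))
    e_start = energy(occ)
    for _ in range(steps):
        s = rng.choice(tuple(occ))           # pick a molecule
        t = rng.choice(neighbours(s))        # propose a hop to a neighbour
        if t in occ:
            continue
        old = sum(1 for n in neighbours(s) if n in occ)
        new = sum(1 for n in neighbours(t) if n in occ and n != s)
        d_e = -EPS * (new - old)             # energy change of the hop
        if d_e <= 0 or rng.random() < math.exp(-d_e / T):
            occ.remove(s)
            occ.add(t)
    return e_start, energy(occ)

e_start, e_end = metropolis()
```

Even this crude model exhibits the qualitative point made above: the final morphology depends on temperature and on how long the system is allowed to relax, not only on the interaction energies.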


#### Project EP/K009850/1: University of Oxford (2013-2016)

Funder: UKRI. Funder contribution: 158,970 GBP.

We are in the midst of an information revolution, where advances in science and technology, as well as the day-to-day operation of successful organisations and businesses, are increasingly reliant on the analysis of data. Driving these advances is a deluge of data, which is far outstripping the increase in available computational power. The importance of managing, analysing, and deriving useful understanding from such large-scale data is highlighted by high-profile reports by McKinsey and The Economist, among other outlets, and by the EPSRC's recent ICT priority of "Towards an Intelligent Information Infrastructure". Bayesian analysis is one of the most successful families of methods for analysing data, and one now widely adopted in the statistical sciences as well as in AI technologies like machine learning. The Bayesian approach offers a number of attractive advantages over other methods: flexibility in constructing complex models from simple parts; fully coherent inferences from data; natural incorporation of prior knowledge; explicit modelling assumptions; precise reasoning about uncertainty over model order and parameters; and protection against overfitting. On the other hand, there is a general perception that Bayesian methods can be too slow to be practically useful on big data sets. This is because exact Bayesian computations are typically intractable, so a range of more practical approximate algorithms are needed, including variational approximations, sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). MCMC methods arguably form the most popular class of Bayesian computational techniques, due to their flexibility, general applicability and asymptotic exactness. Unfortunately, MCMC methods do not scale well to big data sets, since they require many iterations to reduce Monte Carlo noise, and each iteration already involves an expensive sweep through the whole data set. In this project we propose to develop the theoretical foundations for a new class of MCMC inference procedures that can scale to billions of data items, thus unlocking the strengths of Bayesian methods for big data. The basic idea is to use a small subset of the data during each parameter update iteration of the algorithm, so that many iterations can be performed cheaply. This introduces excess stochasticity into the algorithm, which can be controlled by annealing the update step sizes towards zero as the number of iterations increases. The resulting algorithm is a cross between an MCMC algorithm and a stochastic optimization algorithm. An initial exploration of this procedure, which we call stochastic gradient Langevin dynamics (SGLD), was initiated by us recently (Welling and Teh, ICML 2011). Our proposal is to lay the mathematical foundations for understanding the theoretical properties of such stochastic MCMC algorithms, and to build on these foundations to develop more sophisticated algorithms. We aim to understand the conditions under which the algorithm is guaranteed to converge, and the type and speed of convergence. Using this understanding, we aim to develop algorithmic extensions and generalizations with better convergence properties, including preconditioned, adaptive and Riemannian methods, Hamiltonian Monte Carlo methods, online Bayesian learning methods, and approximate methods with large step sizes. These algorithms will be empirically validated on real-world problems, including large-scale data analysis problems in text processing and collaborative filtering, which are standard problems in machine learning, and large-scale data from ID Analytics, a partner company interested in detecting identity theft and fraud.
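The mini-batch update with annealed step sizes described above is the core of SGLD. A minimal sketch on a toy Gaussian model (the model, data, and schedule are my own illustrative choices, not the project's code) shows each iteration touching only 100 of 10,000 data points:

```python
# Illustrative SGLD sketch: posterior over the mean of a Gaussian.
# Each update uses a small mini-batch gradient plus injected Gaussian noise,
# with step sizes annealed toward zero as iterations increase.
import math
import random

random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(10000)]  # y_i ~ N(theta, 1)
N = len(data)

def grad_log_post(theta, batch):
    """Unbiased stochastic estimate of d/dtheta log p(theta | data).

    Prior theta ~ N(0, 10^2); likelihood y_i ~ N(theta, 1).  The mini-batch
    likelihood term is rescaled by N / |batch| so the estimate stays unbiased.
    """
    g_prior = -theta / 100.0
    g_lik = (N / len(batch)) * sum(y - theta for y in batch)
    return g_prior + g_lik

theta, samples = 0.0, []
for t in range(1, 5001):
    eps = 0.5 / (N * t ** 0.55)         # step size annealed toward zero
    batch = random.sample(data, 100)    # cheap update: subset of the data
    theta += (0.5 * eps * grad_log_post(theta, batch)
              + random.gauss(0.0, math.sqrt(eps)))  # injected Langevin noise
    if t > 1000:                        # discard burn-in
        samples.append(theta)

post_mean = sum(samples) / len(samples)
```

With the noise term removed this would be plain stochastic gradient ascent on the log posterior; keeping it makes the iterates approximate posterior samples, which is exactly the "cross between MCMC and stochastic optimization" character described above.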


#### Project EP/K012045/1: University of Warwick (2013-2016)

Funder: UKRI. Funder contribution: 265,269 GBP.

Extremal Combinatorics studies relations between various parameters of discrete structures. The area has experienced remarkable growth in the last few decades. Various aspects of Computer Science and Operations Research motivated by large-scale practical problems have come to rely on increasingly sophisticated combinatorial techniques and have posed a whole array of new challenging problems in Discrete Mathematics. At the same time, the development of powerful and deep mathematical methods has greatly expanded the horizon of combinatorial questions that can now be approached, meeting many of the above challenges. The project will concentrate on central questions of Extremal Combinatorics. Two examples are the Turan function, which asks how local restrictions can affect the global size of a hypergraph, and Ramsey theory, which investigates whether large structures contain highly ordered substructures. These problems are notoriously difficult, and even some basic questions remain open. Previous attempts, although not completely successful, have led to a number of useful techniques and insights. Some recent developments (such as hypergraph regularity, graph limits, and flag algebras) give us new powerful tools that may be instrumental in making progress on these problems. The project aims to achieve a better understanding of these areas and to develop generally useful methods and techniques.
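The Turan function mentioned above is easiest to see in its graph case: Mantel's theorem says a triangle-free graph on n vertices has at most floor(n^2/4) edges. The following sketch (my own illustration, not project code) verifies this by brute force for small n:

```python
# Illustrative sketch: brute-force the graph Turan number ex(n, K_3),
# i.e. the maximum edge count over all triangle-free graphs on n vertices.
from itertools import combinations

def max_triangle_free_edges(n):
    """Exhaustive search over all 2^C(n,2) graphs on n labelled vertices."""
    edges = list(combinations(range(n), 2))
    best = 0
    for mask in range(1 << len(edges)):
        chosen = {e for i, e in enumerate(edges) if mask >> i & 1}
        if len(chosen) <= best:
            continue
        # discard any graph containing a triangle
        if any({(a, b), (a, c), (b, c)} <= chosen
               for a, b, c in combinations(range(n), 3)):
            continue
        best = len(chosen)
    return best

# Mantel / Turan: ex(n, K_3) = floor(n^2 / 4)
results = {n: max_triangle_free_edges(n) for n in range(2, 6)}
```

The extremal graphs are the complete bipartite graphs with near-equal parts; for hypergraphs, by contrast, even the analogue of this simplest case is famously open, which is the difficulty the project addresses.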


#### Project EP/K006835/1: University of Liverpool (2013-2016)

Funder: UKRI. Funder contribution: 354,296 GBP.

The global market for lithium-ion batteries is expected to increase from an estimated $8bn in 2008 to $30bn by 2017, according to independent market analyst Takeshita. Lithium-air (lithium-oxygen) batteries are an important technology for future energy storage because their theoretical energy densities are almost an order of magnitude greater than those of the state-of-the-art Li-ion battery. The long-term energy storage needs of society are likely to demand batteries both for stationary storage, to collect surplus energy generated by wind farms, and for powering electric vehicles. The success of these technologies underpins the UK's need to move to a lower-carbon, greener economy that is less reliant on carbon dioxide generating fossil fuels. The development of lithium-oxygen batteries is being hampered by a lack of understanding of the complexity of products formed on the air cathode during reduction and oxidation. Spectroscopy is critical for identifying products and understanding the chemistry at electrode interfaces. Moreover, advanced in situ spectroelectrochemical techniques help us to comprehend these complex interfaces while under full electrochemical control. A particularly sensitive technique, surface-enhanced infrared absorption spectroscopy (SEIRAS), has not been applied to these systems. Furthermore, the development of in situ far-IR spectroscopy would enable us to identify lithium-oxygen compounds at these low frequencies. The goal of this proposal is therefore to further the progress of lithium-oxygen technology by fully understanding the reduction and oxidation pathways taking place within the battery and by comprehending the role of electrocatalytic surfaces.


#### Project EP/K022830/1: Cardiff University (2013-2016)

Funder: UKRI. Funder contribution: 237,733 GBP.

This proposal brings together a critical mass of scientists from the Universities of Cardiff, Lancaster, Liverpool and Manchester, and clinicians from the Christie, Lancaster and Liverpool NHS Hospital Trusts, with the complementary experience and expertise to advance the understanding, diagnosis and treatment of cervical, oesophageal and prostate cancers. Cervical and prostate cancer are very common, and the incidence of oesophageal cancer is rising rapidly. Cytology, biopsy and endoscopy techniques exist for extracting tissue from individuals at risk of developing these diseases; however, the analysis of tissue by the standard techniques is problematic and subjective. There is clearly a national and international need to develop more accurate diagnostics for these diseases, and that is a primary aim of this proposal. Experiments will be conducted on specimens from all three diseases using four different infrared-based techniques with complementary strengths and weaknesses: hyperspectral imaging; Raman spectroscopy; a new instrument to be developed by combining atomic force microscopy with infrared spectroscopy; and a scanning near-field microscope recently installed on the free electron laser on the ALICE accelerator at Daresbury. The latter instrument has recently been shown to have considerable potential for the study of oesophageal cancer, yielding images that show the chemical composition with unprecedented spatial resolution (0.1 microns), while hyperspectral imaging and Raman spectroscopy have been shown by members of the team to provide high-resolution spectra that give insight into the nature of cervical and prostate cancers. The new instrument will be installed on the free electron laser at Daresbury and will yield images on the nanoscale. This combination of techniques will allow the team to probe the physical and chemical structure of these three cancers with unprecedented accuracy, and this should reveal important information about their character and the chemical processes that underlie their malignant behaviour. The results of the research will be of interest to the study of cancer generally, particularly if they reveal features common to all three cancers. The infrared techniques have considerable medical potential and, to differing extents, are on the verge of finding practical applications. Newer terahertz techniques also have significant potential in this field and may be cheaper to implement. Unfortunately, the development of cheap portable terahertz diagnostic instruments is being impeded by the weakness of existing sources of terahertz radiation. By exploiting the terahertz radiation from the ALICE accelerator, which is seven orders of magnitude more intense than conventional sources, the team will advance the design of two different terahertz instruments and assess their performance against the more developed infrared techniques in cancer diagnosis. However, before any of these techniques can be used by medical professionals, it is essential that their strengths and limitations are fully understood. This is one of the objectives of the proposal, and it will be realised by comparing the results of each technique in studies of specimens from the three cancers that are the primary focus of the research. This will be accompanied by the development of databases and algorithms for the automated analysis of spectral and imaging data, thus removing subjectivity from the diagnostic procedure. Finally, the team will explore a new approach to monitoring the interactions between pathogens, pharmaceuticals and relevant cells or tissues at the cellular and subcellular level, using the instruments deployed on the free electron laser at Daresbury together with Raman microscopy. If successful, this will be important in the longer term for developing new treatments for cancer and other diseases.
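The automated, objective spectral analysis envisaged above can be sketched with a toy nearest-centroid classifier on synthetic two-class "spectra". The peak positions, noise level, and class labels here are invented purely for illustration and have no connection to the project's data:

```python
# Illustrative sketch: classify noisy synthetic spectra by assigning each one
# the label of the nearest class-mean spectrum (nearest-centroid rule).
import math
import random

random.seed(42)
CHANNELS = 100

def synthetic_spectrum(peak):
    """A Gaussian band centred at `peak` plus measurement noise."""
    return [math.exp(-((i - peak) ** 2) / 50.0) + random.gauss(0.0, 0.05)
            for i in range(CHANNELS)]

def centroid(spectra):
    """Channel-wise mean of a list of spectra."""
    return [sum(col) / len(spectra) for col in zip(*spectra)]

def classify(spectrum, centroids):
    """Label of the centroid closest in squared Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(spectrum, centroids[label]))

# two hypothetical tissue classes differing in band position
train = {"healthy": [synthetic_spectrum(30) for _ in range(20)],
         "diseased": [synthetic_spectrum(45) for _ in range(20)]}
centroids = {label: centroid(s) for label, s in train.items()}

test_set = [("healthy", synthetic_spectrum(30)) for _ in range(25)] + \
           [("diseased", synthetic_spectrum(45)) for _ in range(25)]
accuracy = sum(classify(s, centroids) == label for label, s in test_set) / 50
```

Real diagnostic pipelines would of course use far richer features and validated models; the point of the sketch is only that a fixed algorithmic rule, unlike a human reader, applies the same criterion to every specimen.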


#### Project EP/L504671/1: Brunel University (2013-2016)

Funder: UKRI. Funder contribution: 637,523 GBP.

The development of a solid-state, radiation-hard, high-temperature sensor for neutron and gamma detection has many potential uses. With long-term reliability suitable for nuclear power generation plant, high energy physics, synchrotron facilities, medical devices and national resilience, solid-state diamond devices are an obvious choice. Diamond eliminates the need to use helium-3 and is very radiation hard. However, diamond is an expensive synthetic material and challenging to process reliably, so work is needed on the use of less expensive polycrystalline diamond. Areas of innovation include precise laser cutting and plasma processing of diamond to improve the production of multi-layer devices for neutron detection. Diamond polishing needs to be improved and understood so that optimal and economic devices can be manufactured. Advanced electron microscopy techniques, nano-mechanical and tensile testing, radiation testing, and high-temperature neutron performance and mechanical stability will be demonstrated to show how this technology can be applied successfully to future power plant designs and radiation monitoring.


#### Project EP/I004130/2: University of Edinburgh (2013-2016)

Funder: UKRI. Funder contribution: 322,634 GBP.

In homotopy theory, topological spaces (i.e. shapes) are regarded as being the same if we can deform continuously from one to the other. Algebraic varieties are spaces defined by polynomial equations, often over the complex numbers; studying their homotopy theory means trying to tell which topological spaces can be deformed continuously to get algebraic varieties, or when a continuous map between algebraic varieties can be continuously deformed to a map defined by polynomials. If the polynomials defining a variety have rational coefficients (i.e. fractions), this automatically gives the complex variety a group of symmetries, called the Galois group. Although these symmetries are not continuous (i.e. nearby points can be sent far apart), they preserve something called the etale topology. This is an abstract concept which looks somewhat unnatural, but behaves well enough to preserve many of the topological features of the variety. Part of my project will involve investigating how the Galois group interacts with the etale topology. I also study algebraic varieties in finite and mixed characteristics. Finite characteristics are universes in which the rules of arithmetic are modified by choosing a prime number p and setting it to zero. For instance, in characteristic 3 the equation 1+1+1=0 holds. In mixed characteristic, p need not be 0, but the sequence 1, p, p^2, p^3, ... converges to 0. Although classical geometry of varieties does not make sense in finite and mixed characteristics, the etale topology provides a suitable alternative, allowing us to gain much valuable insight into the behaviour of the Galois group. This is an area which I find fascinating, as much topological intuition still works in contexts far removed from real and complex geometry. Indeed, many results in complex geometry have been motivated by phenomena observed in finite characteristic.

Moduli spaces parametrise classes of geometric objects, and can themselves often be given geometric structures, similar to those of algebraic varieties. This structure tends to misbehave at points parametrising objects with a lot of symmetry. To obviate this difficulty, algebraic geometers work with moduli stacks, which parametrise the symmetries as well as the objects. Sometimes the symmetries can themselves have symmetries, and so on, giving rise to infinity stacks. Usually, the dimension of a moduli stack can be calculated by naively counting the degrees of freedom in defining the geometric object it parametrises. However, the space usually contains singularities (points where the space is not smooth) and regions of different dimensions. Partially inspired by ideas from theoretical physics, it has been conjectured that every moduli stack can be extended to a derived moduli stack, which would have the expected dimension, but with some of the dimensions only virtual. Extending to these virtual dimensions also removes the singularities, a phenomenon known as 'hidden smoothness'. Different classification problems can give rise to the same moduli stack, but different derived moduli stacks. Much of my work will be to try to construct derived moduli stacks for a large class of problems. This has important applications in algebraic geometry, as there are many problems for which the moduli stacks are unmanageable, but which should become accessible using derived moduli stacks. I will also seek to investigate the geometry and behaviour of derived stacks themselves. A common thread through the various aspects of my project will be to find ways of applying powerful ideas and techniques from a branch of topology, namely homotopy theory, in contexts where they would not, at first sight, appear to be relevant.
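The finite and mixed characteristic arithmetic described above can be made concrete in a few lines. This standalone illustration (the helper names are mine, not the project's) checks that 1+1+1 = 0 in characteristic 3, and that the sequence 1, p, p^2, ... shrinks to 0 under the p-adic absolute value |x|_p = p^(-v), where p^v exactly divides x:

```python
# Illustrative sketch of characteristic-p and p-adic arithmetic.
from fractions import Fraction

def char_p_sum(terms, p):
    """Addition in characteristic p: the ordinary sum, reduced mod p."""
    return sum(terms) % p

def p_adic_abs(x, p):
    """p-adic absolute value of a nonzero integer: p^(-v_p(x))."""
    v = 0
    while x % p == 0:
        x //= p
        v += 1
    return Fraction(1, p ** v)    # exact rational, small when p^v is large

# characteristic 3: 1 + 1 + 1 = 0
three_ones = char_p_sum([1, 1, 1], 3)

# in the p-adic metric, higher powers of p are *smaller*
sizes = [p_adic_abs(5 ** n, 5) for n in range(5)]
```

The inversion of intuition in the last line, where divisibility by a high power of p means being close to 0, is exactly what makes the mixed-characteristic limit statement in the abstract sensible.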

All Research productsarrow_drop_down `<script type="text/javascript"> <!-- document.write('<div id="oa_widget"></div>'); document.write('<script type="text/javascript" src="https://www.openaire.eu/index.php?option=com_openaire&view=widget&format=raw&projectId=ukri________::428c0415afba8cb76216f5e06b194a6e&type=result"></script>'); --> </script>`

For further information contact us at__helpdesk@openaire.eu__

#### Loading

assignment_turned_in __Project__2013 - 2016 University of EdinburghFunder: UKRI Project Code: EP/J019720/1Funder Contribution: 664,248 GBPNew ideas for carbon capture are urgently needed to combat climate change. Retro-fitting post-combustion carbon capture to existing power plants has the greatest potential to reduce CO2 emissions considering these sources make the largest contribution to CO2 emissions in the UK. Unfortunately, carbon capture methods based on existing industrial process technology for separation of CO2 from natural gas streams (i.e. amine scrubbing) would be extremely expensive if applied on the scale envisaged, as exemplified by the recent collapse of the Government's CCS project at Longannet power station. Moreover, many of the chemical absorbents used, typically amines, are corrosive and toxic and their use could generate significant amounts of hazardous waste. So, more efficient and 'greener' post-combustion CCS technologies are urgently needed if CCS is to be adopted on a global scale. Efficient separation of CO2 from flue gases requires at least the following; i) an inexpensive sorbent with high CO2 working capacity and selectivity, ii) high rates of CO2 mass transfer into and out of the sorbent, and iii) a low energy cost for sorbent regeneration. A traditional aqueous amine scrubbing process has high selectivity, but is less effective in terms of capacity, mass transfer rate, and sorbent regeneration energy penalty. Here, we propose to investigate a novel process based on the 'wetting layer absorption' (WLA) concept in which a porous material is used to support liquid-like regions of absorbing solvent, which in turn absorb the gas of interest, in this case carbon dioxide. This process, recently invented by one of the authors (MS) of this proposal at Strathclyde, is being pioneered by researchers in Scotland. Initial work involved investigation of the use of physical solvents. Here the focus is on a process involving chemical solvents, i.e. 
amines. This process should have a high capacity, high selectivity, and high rates of mass transfer. Another novel aspect of this work is the investigation of microwave regeneration, which could also greatly reduce the cost of sorbent regeneration. Finally, the process would reduce solvent recycling by orders of magnitude, and could make use of much less toxic and corrosive solvents, leading to a much greener process. Ultimately, the WLA process involving chemical solvents could significantly reduce the cost and environmental impact of carbon capture.


For further information contact us at helpdesk@openaire.eu

Project (2013 - 2016), University of Surrey. Funder: UKRI. Project Code: EP/K031805/1. Funder Contribution: 221,071 GBP

During the last twenty years mathematics and physics have significantly influenced each other and become highly entangled. Mathematical physics has always produced a wide variety of new concepts and problems that have become important subjects of pure mathematical research. The growth of gauge, gravity and string theories has made the relation between these subjects closer than ever before. An important driving force was the discovery of quantum groups and of the gauge/gravity dualities. Here the leading role has been played by the so-called AdS/CFT duality and its underlying integrable structure. A far-reaching concept is the effect of boundaries and the corresponding boundary conditions. They are unavoidable in almost all models of mathematical physics and are of fundamental importance. The introduction of boundaries into the theory of quantum groups leads to a whole new class of so-called reflection algebras. Such algebras have been shown to appear in numerous mathematical models and are at the core of their integrable structure. Furthermore, these algebras have also been shown to play a prominent role in AdS/CFT. However, a coherent framework for describing such algebras is not known, and many properties of the reflection algebras remain open questions. The goal of this research is to develop new algebraic methods and intradisciplinary connections between the axiomatic theory of algebras and the theory of quantum groups, inspired by the integrable structure of AdS/CFT, in particular by shedding more light on the effects of boundaries and different boundary configurations. 
The research is driven by applying algebraic objects such as the quantum affine and Yangian algebras to find elegant, exact solutions describing the models that arise from and are inspired by the gauge/gravity dualities.
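For orientation (this gloss is not part of the proposal itself), the reflection algebras mentioned above are commonly defined via Sklyanin's boundary Yang-Baxter ("reflection") equation, relating a boundary matrix K(u) to the bulk R-matrix R(u):

```latex
\[
R_{12}(u-v)\,K_{1}(u)\,R_{21}(u+v)\,K_{2}(v)
\;=\;
K_{2}(v)\,R_{12}(u+v)\,K_{1}(u)\,R_{21}(u-v),
\]
```

where the subscripts indicate on which tensor factor each operator acts; solutions K(u) encode integrable boundary conditions in the same way that solutions of the Yang-Baxter equation encode integrable bulk interactions.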


Project (2013 - 2016), University of Warwick. Funder: UKRI. Project Code: EP/K030159/1. Funder Contribution: 342,000 GBP

Ultrasound is used in many applications, including medical imaging, non-destructive evaluation and therapeutic ultrasound. In all these cases, there is usually a need for the formation of images or the creation of a focal region. Current methods for the generation of optimal acoustic fields generally rely on a linear process within the transducer. This linear transduction process influences the resultant properties in terms of spatial resolution and maximum intensity, noting that there are fundamental limits on the spatial resolution and power densities that can be achieved in such focal regions. Recent work in acoustics has demonstrated that a new type of acoustic signal can be generated via non-linear effects in chains of particles, which act as a kind of waveguide. These signals are based on the propagation of solitary waves. They have been studied at low frequencies, but this study will look at the possibility of using these new structures in biomedical ultrasound. Materials that support solitary waves are not used in standard ultrasonic work; little has been published on their use, despite the fact that a step change in performance may be possible. In this proposal, such waves will be generated within ultrasonic sources containing multiple solitary wave chains, at frequencies in the 500 kHz - 5 MHz range. To our knowledge, this has not been investigated before. Arrays are also possible, where each chain forms a single element. Because the chains would be coupled primarily along their length, but not laterally between chains, issues arising from mechanical cross-coupling might be avoided. Pre-compression of each chain would alter the propagation velocity within it, allowing beam-steering and focussing to be created. 
The propagation characteristics also change with signal amplitude, leading to the possibility of an acoustic diode. These innovations would have applications in areas such as ultrasound-enhanced drug delivery, High Intensity Focussed Ultrasound (HIFU) for the treatment of tumours, and harmonic imaging.


Project (2013 - 2016), KCL. Funder: UKRI. Project Code: EP/J019844/1. Funder Contribution: 263,385 GBP

Organic molecular monolayers at surfaces often constitute the central working component in nanotechnologies such as sensors, molecular electronics, smart coatings, organic solar cells, catalysts and medical devices. A central challenge in the field is to achieve controlled creation of desired 2D molecular architectures at surfaces. Within this context, the past decade has witnessed a real and significant step-change in the 'bottom-up' self-organisation of 2D molecular assemblies at surfaces. The enormous variety and abundance of molecular structures formed via self-organisation has now tipped the argument strongly in favour of a 'bottom-up' construction strategy, which harnesses two powerful attributes: nanometre precision (inaccessible to top-down methods) and highly parallel fabrication (impossible with atomic/molecular manipulation). Thus, bottom-up molecular assembly at surfaces holds the real possibility of becoming a dominant synthesis protocol in 21st century nanotechnologies. Uniquely, the scope and versatility of these molecular architectures at 2D surfaces have been directly captured at the nanoscale via imaging with scanning probe microscopies and advanced surface spectroscopies. At present, however, the field is largely restricted to a 'make and see' approach, and there is scarce understanding of any of the parameters that ultimately control molecular surface assembly. 
For example: (1) molecular assemblies at surfaces show highly polymorphic behaviour, and a priori control of assembly is practically non-existent; (2) little is understood of the influence and balance of the many interactions that drive molecular recognition and assembly (molecule-molecule interactions including dispersion, directional H-bonding, and strong electrostatic and covalent interactions); (3) the role of surface-molecule interactions is largely uncharted, even though they play a significant role in the diffusion of molecules and their subsequent assembly; (4) there is ample evidence that the kinetics of self-assembly is the major factor in determining the final structure, often driving polymorphic behaviour and leading to widely varied outcomes depending on the conditions of formation; (5) a gamut of additional surface phenomena also influence assembly, e.g. chemical reactions between molecules, thermally activated internal degrees of freedom of molecules, surface reconstructions, and co-assembly via coordinating surface atoms. The main objective of this project is to advance from experimental phenomena-reporting to knowledge-based design, and its central goal is to identify the role played by thermodynamic, entropic, kinetic and chemical factors in dictating molecular organisation at surfaces under given experimental conditions. Addressing this challenge requires a two-pronged approach in which ambitious and comprehensive theory development is undertaken alongside powerful imaging and spectroscopic tools applied to the same systems. This synergy of experiment and theory is absolutely essential to develop a fundamental understanding, which would enable a roadmap for controlled and engineered self-assembly at surfaces to be proposed that would, ultimately, allow one to 'dial up' a required structure at will. 
Four important and qualitatively different classes of assembly at surfaces will be studied: Molecular Self-Assembly; Hierarchical Self-Assembly; Metal-Organic Self-Assembly; and on-surface Covalent Assembly.


Project (2013 - 2016), University of Oxford. Funder: UKRI. Project Code: EP/K009850/1. Funder Contribution: 158,970 GBP

We are in the midst of an information revolution, where advances in science and technology, as well as the day-to-day operation of successful organisations and businesses, are increasingly reliant on the analysis of data. Driving these advances is a deluge of data, which is far outstripping the increase in computational power available. The importance of managing, analysing, and deriving useful understanding from such large-scale data is highlighted by high-profile reports by McKinsey and The Economist, among other outlets, and by the EPSRC's recent ICT priority of "Towards an Intelligent Information Infrastructure". Bayesian analysis is one of the most successful families of methods for analysing data, and one now widely adopted in the statistical sciences as well as in AI technologies like machine learning. The Bayesian approach offers a number of attractive advantages over other methods: flexibility in constructing complex models from simple parts; fully coherent inferences from data; natural incorporation of prior knowledge; explicit modelling assumptions; precise reasoning about uncertainties over model order and parameters; and protection against overfitting. On the other hand, there is a general perception that Bayesian methods can be too slow to be practically useful on big data sets. This is because exact Bayesian computations are typically intractable, so a range of more practical approximate algorithms are needed, including variational approximations, sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). MCMC methods arguably form the most popular class of Bayesian computational techniques, due to their flexibility, general applicability and asymptotic exactness. 
Unfortunately, MCMC methods do not scale well to big data sets, since they require many iterations to reduce Monte Carlo noise, and each iteration already involves an expensive sweep through the whole data set. In this project we propose to develop the theoretical foundations for a new class of MCMC inference procedures that can scale to billions of data items, thus unlocking the strengths of Bayesian methods for big data. The basic idea is to use a small subset of the data during each parameter update iteration of the algorithm, so that many iterations can be performed cheaply. This introduces excess stochasticity in the algorithm, which can be controlled by annealing the update step sizes towards zero as the number of iterations increases. The resulting algorithm is a cross between an MCMC and a stochastic optimization algorithm. An initial exploration of this procedure, which we call stochastic gradient Langevin dynamics (SGLD), was initiated by us recently (Welling and Teh, ICML 2011). Our proposal is to lay the mathematical foundations for understanding the theoretical properties of such stochastic MCMC algorithms, and to build on these foundations to develop more sophisticated algorithms. We aim to understand the conditions under which the algorithm is guaranteed to converge, and the type and speed of convergence. Using this understanding, we aim to develop algorithmic extensions and generalizations with better convergence properties, including preconditioning, adaptive and Riemannian methods, Hamiltonian Monte Carlo methods, Online Bayesian learning methods, and approximate methods with large step sizes. These algorithms will be empirically validated on real world problems, including large scale data analysis problems for text processing and collaborative filtering which are standard problems in machine learning, and large scale data from ID Analytics, a partner company interested in detecting identity theft and fraud.
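The update rule described above, a minibatch gradient estimate plus Gaussian noise with step sizes annealed towards zero, can be sketched in a few lines. This is an illustrative implementation of SGLD in the spirit of Welling and Teh (2011), not the project's own code; the function names and the polynomial annealing schedule are choices made for the example.

```python
import numpy as np

def sgld_sample(data, grad_log_prior, grad_log_lik, theta0,
                n_iter=1000, batch_size=10, eps0=1e-2, gamma=0.55):
    """Stochastic gradient Langevin dynamics (after Welling & Teh, 2011).

    Each iteration uses a minibatch estimate of the log-posterior
    gradient plus injected Gaussian noise; the step size is annealed
    towards zero so the excess stochasticity from subsampling is
    controlled, as described in the abstract.
    """
    N = len(data)
    theta = np.asarray(theta0, dtype=float)
    samples = []
    for t in range(1, n_iter + 1):
        eps = eps0 * t ** (-gamma)  # polynomially decaying step size
        batch = data[np.random.choice(N, batch_size, replace=False)]
        # Unbiased estimate of the log-posterior gradient:
        # prior term plus the rescaled minibatch likelihood term.
        grad = grad_log_prior(theta) + (N / batch_size) * sum(
            grad_log_lik(theta, x) for x in batch)
        noise = np.random.normal(0.0, np.sqrt(eps), size=theta.shape)
        theta = theta + 0.5 * eps * grad + noise
        samples.append(theta.copy())
    return np.array(samples)
```

For a toy Gaussian-mean posterior (normal likelihood, broad normal prior), the chain produced this way concentrates near the data mean after a short burn-in, while each iteration touches only `batch_size` of the `N` data points.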


Project (2013 - 2016), University of Warwick. Funder: UKRI. Project Code: EP/K012045/1. Funder Contribution: 265,269 GBP

Extremal Combinatorics studies relations between various parameters of discrete structures. This area has experienced remarkable growth in the last few decades. Various aspects of Computer Science and Operations Research motivated by large-scale practical problems have been relying on ever more sophisticated combinatorial techniques, and have posed a whole array of new challenging problems in Discrete Mathematics. At the same time, the development of powerful and deep mathematical methods has greatly expanded the horizon of combinatorial questions that can now be approached, meeting many of the above challenges. The project will concentrate on central questions of Extremal Combinatorics. Two examples are the Turán function, which asks how local restrictions can affect the global size of a hypergraph, and Ramsey theory, which investigates whether large structures contain highly ordered substructures. These problems are notoriously difficult, and even some basic questions remain open. Previous attempts, although not completely successful, have led to a number of useful techniques and insights. Some recent developments (such as hypergraph regularity, graph limits, and flag algebras) give us new powerful tools that may be instrumental in obtaining progress on these problems. The project aims at achieving a better understanding of these areas and developing generally useful methods and techniques.
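As a point of reference (a standard definition, not taken from the proposal), the Turán function ex(n, F) is the maximum number of edges in an n-vertex (hyper)graph containing no copy of F; for ordinary graphs, Turán's theorem settles the complete-graph case:

```latex
\[
\mathrm{ex}(n,F)=\max\{\,e(G) : |V(G)|=n,\ F\nsubseteq G\,\},
\qquad
\mathrm{ex}(n,K_{r+1})=\Bigl(1-\frac{1}{r}\Bigr)\frac{n^{2}}{2}+O(1).
\]
```

For hypergraphs, by contrast, even basic cases of the Turán function are unknown, which is the sense in which the abstract calls these questions notoriously difficult.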


Project (2013 - 2016), University of Liverpool. Funder: UKRI. Project Code: EP/K006835/1. Funder Contribution: 354,296 GBP

The global market for lithium-ion batteries is expected to increase from an estimated $8bn in 2008 to $30bn by 2017, according to independent market analyst Takeshita. Lithium-air (lithium-oxygen) batteries are an important technology for future energy storage because they have theoretical energy densities almost an order of magnitude greater than the state-of-the-art Li-ion battery. The long-term energy storage needs of society are likely to demand batteries both for stationary power storage, to collect surplus energy generated by wind farms, and for powering electric vehicles. The success of these technologies underpins the UK's need to move to a lower-carbon, greener economy that is less reliant on carbon dioxide-generating fossil fuels. The development of lithium-oxygen batteries is being hampered by a lack of understanding of the complexity of the products formed on the air-cathode during reduction and oxidation. Spectroscopy is critical for the identification of these products and for understanding the chemistry at the electrode interface. Moreover, advanced in situ spectroelectrochemical techniques help us to comprehend these complex interfaces whilst under full electrochemical control. A particularly sensitive technique, surface-enhanced infrared absorption spectroscopy (SEIRAS), has not yet been applied to these systems. Furthermore, the development of in situ far-IR spectroscopy would enable us to identify lithium-oxygen compounds at these low frequencies. The goal of this proposal is therefore to further the progress of lithium-oxygen technology by fully understanding the reduction and oxidation pathways taking place within the battery, and to comprehend the role of electrocatalytic surfaces.


Project (2013 - 2016), Cardiff University. Funder: UKRI. Project Code: EP/K022830/1. Funder Contribution: 237,733 GBP

This proposal brings together a critical mass of scientists from the Universities of Cardiff, Lancaster, Liverpool and Manchester and clinicians from the Christie, Lancaster and Liverpool NHS Hospital Trusts, with the complementary experience and expertise to advance the understanding, diagnosis and treatment of cervical, oesophageal and prostate cancers. Cervical and prostate cancers are very common, and the incidence of oesophageal cancer is rising rapidly. There are cytology, biopsy and endoscopy techniques for extracting tissue from individuals who are at risk of developing these diseases. However, the analysis of tissue by the standard techniques is problematic and subjective. There is clearly a national and international need to develop more accurate diagnostics for these diseases, and that is a primary aim of this proposal. Experiments will be conducted on specimens from all three diseases using four different infrared-based techniques which have complementary strengths and weaknesses: hyperspectral imaging, Raman spectroscopy, a new instrument to be developed by combining atomic force microscopy with infrared spectroscopy, and a scanning near-field microscope recently installed on the free electron laser on the ALICE accelerator at Daresbury. The latter instrument has recently been shown to have considerable potential for the study of oesophageal cancer, yielding images that show the chemical composition with unprecedented spatial resolution (0.1 microns), while hyperspectral imaging and Raman spectroscopy have been shown by members of the team to provide high-resolution spectra that give insight into the nature of cervical and prostate cancers. The new instrument will be installed on the free electron laser at Daresbury and will yield images on the nanoscale. 
This combination of techniques will allow the team to probe the physical and chemical structure of these three cancers with unprecedented accuracy, and this should reveal important information about their character and the chemical processes that underlie their malignant behaviour. The results of the research will be of interest to the study of cancer generally, particularly if they reveal features common to all three cancers. The infrared techniques have considerable medical potential and, to differing extents, are on the verge of finding practical applications. Newer terahertz techniques also have significant potential in this field and may be cheaper to implement. Unfortunately, the development of cheap portable terahertz diagnostic instruments is being impeded by the weakness of existing sources of terahertz radiation. By exploiting the terahertz radiation from the ALICE accelerator, which is seven orders of magnitude more intense than conventional sources, the team will advance the design of two different terahertz instruments and assess their performance against the more developed infrared techniques in cancer diagnosis. However, before any of these techniques can be used by medical professionals it is essential that their strengths and limitations are fully understood. This is one of the objectives of the proposal, and it will be realised by comparing the results of each technique in studies of specimens from the three cancers that are the primary focus of the research. This will be accompanied by developing databases and algorithms for the automated analysis of spectral and imaging data, thus removing subjectivity from the diagnostic procedure. Finally, the team will explore a new approach to monitoring the interactions between pathogens, pharmaceuticals and relevant cells or tissues at the cellular and subcellular level, using the instruments deployed on the free electron laser at Daresbury together with Raman microscopy. 
If this is successful, it will be important in the longer term in developing new treatments for cancer and other diseases.


Project (2013 - 2016), Brunel University. Funder: UKRI. Project Code: EP/L504671/1. Funder Contribution: 637,523 GBP

A solid-state, radiation-hard, high-temperature sensor for neutron and gamma detection has many potential uses. Given the long-term reliability required for use in nuclear power generation plant, high energy physics, synchrotron facilities, medical devices and national resilience, solid-state diamond devices are an obvious choice. Diamond eliminates the need to use helium-3 and is very radiation hard. However, diamond is an expensive synthetic material and is challenging to process reliably, so work is needed on the use of less expensive poly-crystalline diamond. Areas of innovation include precise laser cutting and plasma processing of diamond to improve the production of multi-layer devices for neutron detection. Diamond polishing needs to be improved and understood so that optimal and economic devices can be manufactured. Advanced electron microscopy techniques, nano-mechanical and tensile testing, and radiation testing, as well as high-temperature neutron performance and mechanical stability, will be demonstrated to show how this technology can be applied successfully to future power plant designs and radiation monitoring.


Project (2013 - 2016), University of Edinburgh. Funder: UKRI. Project Code: EP/I004130/2. Funder Contribution: 322,634 GBP

In homotopy theory, topological spaces (i.e. shapes) are regarded as being the same if we can deform continuously from one to the other. Algebraic varieties are spaces defined by polynomial equations, often over the complex numbers; studying their homotopy theory means trying to tell which topological spaces can be deformed continuously to give algebraic varieties, or when a continuous map between algebraic varieties can be continuously deformed to a map defined by polynomials. If the polynomials defining a variety have rational coefficients (i.e. fractions), this automatically gives the complex variety a group of symmetries, called the Galois group. Although these symmetries are not continuous (i.e. nearby points can be sent far apart), they preserve something called the etale topology. This is an abstract concept which looks somewhat unnatural, but behaves well enough to preserve many of the topological features of the variety. Part of my project will involve investigating how the Galois group interacts with the etale topology. I also study algebraic varieties in finite and mixed characteristics. Finite characteristics are universes in which the rules of arithmetic are modified by choosing a prime number p and setting it to zero. For instance, in characteristic 3 the equation 1+1+1=0 holds. In mixed characteristic, p need not be 0, but the sequence 1, p, p^2, p^3, ... converges to 0. Although classical geometry of varieties does not make sense in finite and mixed characteristics, the etale topology provides a suitable alternative, allowing us to gain much valuable insight into the behaviour of the Galois group. This is an area which I find fascinating, as much topological intuition still works in contexts far removed from real and complex geometry. 
Indeed, many results in complex geometry have been motivated by phenomena observed in finite characteristic. Moduli spaces parametrise classes of geometric objects, and can themselves often be given geometric structures, similar to those of algebraic varieties. This structure tends to misbehave at points parametrising objects with a lot of symmetry. To obviate this difficulty, algebraic geometers work with moduli stacks, which parametrise the symmetries as well as the objects. Sometimes the symmetries can themselves have symmetries, and so on, giving rise to infinity stacks. Usually, the dimension of a moduli stack can be calculated by naively counting the degrees of freedom in defining the geometric object it parametrises. However, the space usually contains singularities (points where the space is not smooth) and regions of different dimensions. Partially inspired by ideas from theoretical physics, it has been conjectured that every moduli stack can be extended to a derived moduli stack, which would have the expected dimension, but with some of the dimensions only virtual. Extending to these virtual dimensions also removes the singularities, a phenomenon known as 'hidden smoothness'. Different classification problems can give rise to the same moduli stack, but different derived moduli stacks. Much of my work will be to try to construct derived moduli stacks for a large class of problems. This has important applications in algebraic geometry, as there are many problems for which the moduli stacks are unmanageable, but which should become accessible using derived moduli stacks. I will also seek to investigate the geometry and behaviour of derived stacks themselves. A common thread through the various aspects of my project will be to find ways of applying powerful ideas and techniques from a branch of topology, namely homotopy theory, in contexts where they would not, at first sight, appear to be relevant.
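The "characteristic p" arithmetic mentioned in this abstract can be illustrated directly (the example below is mine, not the proposal's): working modulo a prime p makes p itself equal to zero, so in characteristic 3 the abstract's equation 1+1+1=0 indeed holds.

```python
def char_p_examples(p):
    """Arithmetic in the finite field F_p, where the prime p is 'set to zero'.

    Returns the value of 1+1+1 in characteristic p, together with a
    check of Fermat's little theorem (a^p = a for every a in F_p).
    """
    triple = (1 + 1 + 1) % p                 # e.g. 0 when p = 3
    fermat_holds = all(pow(a, p, p) == a % p for a in range(p))
    return triple, fermat_holds

# In characteristic 3, 1 + 1 + 1 = 0:
print(char_p_examples(3))
```

The same reduction-mod-p viewpoint is what lets one move statements between characteristic 0 and characteristic p, which is the transfer of intuition the abstract describes.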

