Project (2008-2011): LANL, Imperial College London, NSU
Funder: UKRI | Project Code: EP/F00947X/1 | Contribution: 298,769 GBP

The objective of this proposal is to improve imaging techniques which exploit the interaction of waves with matter to reconstruct the physical properties of an object. To date these techniques have been limited by the trade-off between resolution and imaging depth. While long wavelengths can penetrate deep into a medium, the classical diffraction limit precludes the possibility of observing subwavelength structures. Over the past twenty years, near-field microscopy has demonstrated that the diffraction limit can be broken by bringing a probing sensor within one wavelength of the surface of the object to be imaged. The scope of near-field microscopy has now been extended to the reconstruction of subwavelength structures from measurements performed in the far field; this approach has a much wider range of applications, since often the structure to be imaged is not directly accessible.

The key to subwavelength resolution lies in the physical models employed to describe wave scattering. Conventional imaging methods use the Born approximation, which does not take into account the distortion of the probing wave as it travels through the medium to be imaged, thereby neglecting what is known as multiple scattering. Multiple scattering, however, is the key mechanism that can encode subwavelength information in the far field, leading to potentially unlimited resolution. New experimental evidence has shown that a resolution better than a quarter of the wavelength can be achieved for an object more than 70 wavelengths away from the probing sensors. This preliminary work has established the fundamental principle that subwavelength resolution from far-field measurements is possible as long as the Born approximation is abandoned and more accurate models of the wave-matter interaction are adopted.
The aim of this proposal is to pursue this new and exciting idea and to focus on more specific applications such as medical diagnostics and geophysical imaging.
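To make the contrast concrete, here is a minimal numerical sketch (not part of the proposal; all values are hypothetical) comparing the Born approximation with a full multiple-scattering solution, using the Foldy-Lax equations for two point scatterers in a 1D Helmholtz model:

```python
import numpy as np

# 1D Helmholtz Green's function: G(x, y) = i/(2k) * exp(i*k*|x - y|)
k = 2 * np.pi                   # wavenumber (wavelength = 1)
xs = np.array([0.0, 0.3])       # two point scatterers, subwavelength spacing
ts = np.array([1.0, 1.0])       # scattering strengths (hypothetical)

def G(a, b):
    return 1j / (2 * k) * np.exp(1j * k * abs(a - b))

u_inc = np.exp(1j * k * xs)     # incident plane wave sampled at the scatterers

# Born approximation: each scatterer is driven by the incident field only
u_born = u_inc.copy()

# Multiple scattering (Foldy-Lax): solve the self-consistent system
#   u_i = u_inc(x_i) + sum_{j != i} t_j * G(x_i, x_j) * u_j
n = len(xs)
A = np.eye(n, dtype=complex)
for i in range(n):
    for j in range(n):
        if i != j:
            A[i, j] = -ts[j] * G(xs[i], xs[j])
u_full = np.linalg.solve(A, u_inc)

# Far-field scattered wave at an observation point ~70 wavelengths away
x_obs = 70.0
far_born = sum(ts[j] * G(x_obs, xs[j]) * u_born[j] for j in range(n))
far_full = sum(ts[j] * G(x_obs, xs[j]) * u_full[j] for j in range(n))

# The difference is the extra far-field signal carried by multiple
# scattering -- exactly the contribution the Born approximation discards.
print(abs(far_full - far_born))
```

Inverting far-field data with the full model, rather than the Born linearisation, is what allows the subwavelength spacing of the scatterers to be recovered.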
Project (2008-2011): University of Essex
Funder: UKRI | Project Code: EP/F033818/1 | Contribution: 364,770 GBP

The keyboard is a piece of plastic with lots of switches, which provides us with a reliable but very unnatural form of input. The mouse is slightly less primitive, but it is still only an electro-mechanical transducer of musculoskeletal movement. Both have been with us for many years and remain the best computer interfaces we have at the moment, yet they are unusable for people with severe musculoskeletal disorders and are themselves known causes of work-related upper-limb and back disorders: both hugely widespread problems for the UK's workforce. Wouldn't it be nice some day to be able to dispose of them and replace them with Brain-Computer Interfaces (BCIs) capable of directly interpreting the desires and intentions of computer users?

This adventurous proposal aims to carry out an innovative and ambitious interdisciplinary research programme at the intersection of Computer Science, Biomedical Engineering, Neuroscience and Psychology, aimed at developing precisely these devices with a novel and powerful BCI approach recently developed by the applicants. BCI has been a dream of researchers for many years, but developments have been slow and, with rare exceptions, BCI is still effectively a curiosity that can be used only in the lab. So, what's different about this project?

In very recent research, we were able to develop a prototype BCI mouse capable of full 2-D motion control (a rarity in the BCI world), which uniquely can be used by any person, without any prior training, within minutes. This was possible thanks to our taking a completely innovative approach to BCI. Previous BCI designs were based on the paradigm of observing EEG signals, looking for specific features or waves, manipulating them, and then making a yes-or-no decision as to whether such features or waves were present.
Contrary to this design wisdom, we completely dispose of the decision step and allow brain waves to control the computer directly via simple analogue transformations. Furthermore, we only partially design the system, leaving the completion and customisation of the design for each specific user to an evolutionary algorithm which, thanks to an artificial form of Darwinian evolution inside the computer, performs the difficult tasks of selecting the best EEG channels, waves and analogue manipulations. Using these same two ingredients (the analogue approach and evolutionary design), and starting from our successful experimental BCI mouse, this project specifically aims to develop brain-computer interfaces which are sufficiently robust, flexible and cheap to leave the lab and start making a serious impact in the real world. To maximise performance, preliminary work will determine the optimal visual presentation conditions that minimise cognitive load, perceptual errors and target-distractor interference.
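As an illustration of the evolutionary-design idea (a toy sketch, not the project's actual algorithm; the synthetic "EEG" data and every parameter are invented), here is a minimal evolutionary loop that learns an analogue channel weighting whose output tracks a hidden intent signal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for multi-channel EEG: 8 channels of noise, one of
# which weakly carries the user's "intent" signal (channel 3 here).
n_samples, n_channels = 500, 8
target = np.sin(np.linspace(0, 20, n_samples))   # desired analogue control output
eeg = rng.normal(size=(n_samples, n_channels))
eeg[:, 3] += 0.5 * target

def fitness(weights):
    out = eeg @ weights                          # simple analogue transformation
    return abs(np.corrcoef(out, target)[0, 1])   # how well it tracks the intent

# Minimal evolutionary loop: keep the best channel weightings, mutate them
pop = [rng.normal(size=n_channels) for _ in range(20)]
for generation in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:5]
    pop = parents + [p + 0.2 * rng.normal(size=n_channels)
                     for p in parents for _ in range(3)]

best = max(pop, key=fitness)
print(f"best correlation with intent signal: {fitness(best):.2f}")
```

There is no yes-or-no decision step anywhere: the evolved weighting maps brain activity continuously onto a control output, which is the analogue principle described above.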
Project (2008-2011): University of York
Funder: UKRI | Project Code: EP/F027028/1 | Contribution: 228,108 GBP

Diophantine approximation is a branch of number theory that can loosely be described as a quantitative analysis of the property that every real number can be approximated arbitrarily closely by a rational number. The theory dates back to the ancient Greeks and Chinese, who used good rational approximations to the number pi (3.14159...) in order to accurately predict the position of planets and stars.

The metric theory of Diophantine approximation is the study of the approximation properties of real numbers by rationals from a measure-theoretic (probabilistic) point of view. The central theme is to determine whether a given approximation property holds everywhere except on an exceptional set of measure zero. In his pioneering work of 1924, Khintchine established an elegant probabilistic criterion (a 'zero-one' law) in terms of Lebesgue measure for a real number to be approximable by rationals with an arbitrary decreasing (monotonic) error. The error is a function of the size of the denominators of the rational approximates and decreases as the size of the denominators increases. The monotonicity assumption is crucial, since the criterion is false otherwise. Under the natural assumption that the rational approximates are reduced (i.e. in their lowest form, so that the error of approximation at a rational point is determined uniquely), the Duffin-Schaeffer conjecture (1941) provides the appropriate expected statement without the monotonicity assumption. It represents one of the most famous unsolved problems in number theory. A major aim is to make significant contributions to this key conjecture by exploiting the recent 'martingale' approach developed by Haynes (the named Research Assistant) and Vaaler.
Furthermore, a more general form of the conjecture, in which Lebesgue measure is replaced by Hausdorff measure (a fractal quantity), will be investigated. A major outcome will be the Duffin-Schaeffer conjecture for measures close to Lebesgue measure. The importance of the Duffin-Schaeffer conjecture is unquestionable. However, it does change the underlying nature of the problem considered by Khintchine, in that the rational approximates are reduced. In 1971, Catlin stated a conjecture for the unconstrained problem in which the rationals are not assumed to be reduced. Catlin claimed that his conjecture was equivalent to the Duffin-Schaeffer conjecture. However, his proof contained a serious flaw, and the claim remains an interesting problem in its own right.

In higher dimensions, the approximation of arbitrary points in n-dimensional space by rational points (simultaneous approximation) or rational hyperplanes (dual approximation) is the natural generalisation of the one-dimensional theory. Considering a system of linear forms unifies both forms of approximation and naturally gives rise to the linear forms theory. The metric theory of Diophantine approximation is complete for simultaneous approximation in dimension greater than one: the analogues of Khintchine's criterion without any monotonicity assumption (i.e. the simultaneous Catlin conjecture) and of the Duffin-Schaeffer conjecture have both been established, as well as the more precise and delicate Hausdorff measure-theoretic statements. However, the dual and, more generally, the linear forms theory are far from complete. In this proposal the linear forms analogues of the Duffin-Schaeffer and Catlin conjectures are precisely formulated. A principal goal is to establish these conjectures in dimension greater than one. A novel idea is to develop a 'slicing' technique that reduces a linear forms problem to a well-understood simultaneous problem. The major outcome will be a unified linear forms theory in Euclidean space.
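The classical approximations to pi alluded to above fall out of the continued-fraction algorithm. The following sketch (illustrative only) computes the first convergents of pi and shows the |pi - p/q| < 1/q^2 quality of approximation that the metric theory quantifies in general:

```python
import math
from fractions import Fraction

# Partial quotients of pi via the continued-fraction algorithm
x = math.pi
a, t = [], x
for _ in range(6):
    ai = int(t)
    a.append(ai)
    t = 1 / (t - ai)

# Fold each prefix of partial quotients back up into a convergent p/q
convergents = []
for i in range(1, len(a) + 1):
    frac = Fraction(a[i - 1])
    for ai in reversed(a[:i - 1]):
        frac = ai + 1 / frac
    convergents.append(frac)

for c in convergents:
    err = abs(x - c)   # float error of the rational approximation
    print(f"{c.numerator}/{c.denominator}: |pi - p/q| = {err:.3e} < 1/q^2 = {1 / c.denominator**2:.3e}")
```

The list includes the ancient approximations 22/7 and 355/113; each convergent beats the 1/q^2 benchmark, and the metric theory asks how much better than this one can do for almost every real number.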
Project (2008-2011): University of Edinburgh
Funder: UKRI | Project Code: DT/F007744/1 | Contribution: 719,422 GBP

Carbon dioxide (CO2) is a greenhouse gas. The concentration of CO2 in the Earth's atmosphere is an important control on surface temperature, and hence climate. CO2 dissolution in the oceans is also being recognised as an important factor in making surface seawater unusually acidic; this severely affects ecosystems and species from algae to fish and whales. Increased CO2 in the atmosphere is recognised as being partly caused by the burning of fossil fuels, such as coal and gas, in power stations.

Carbon Capture and Storage (CCS) is a suite of technologies which enables CO2 to be captured at power stations, liquefied by increasing the pressure, transported by pipe, and injected deep underground into the pore space of deeply buried sedimentary rocks such as sandstones. This can effectively remove CO2 from the power cycle of fossil fuel use and store it for tens of thousands of years, allowing the Earth's atmosphere to return to normal. Because of the very large CO2 volumes involved, it is not possible to build surface stores, and because of the acid effects of CO2, it is not possible to inject CO2 into seawater. By contrast, the Intergovernmental Panel on Climate Change (IPCC) has calculated that more than 25% of world CO2 emissions could be stored by geological CCS. This could be a vital technology for the world's future. There is a great deal of interest worldwide in CCS and, because of the offshore oil industry, the North Sea is one of the world's prime areas for CCS to be rapidly developed. However, there are only three full-scale projects in the world at present. For UK power generating companies to become commercially interested, the chain of technologies must both be demonstrated to work reliably and be capable of cost-effective development.
This project aims to identify deep underground aquifer sites close to power plants in the UK where CO2 can be safely stored, but which are quicker and cheaper to develop than sites offshore in the North Sea. This would enable power generating companies to develop CCS over a period of years, on a medium scale, and learn to conduct the industrial operation. If this project is successful, it could lead to take-up of CCS in the UK 10 or 15 years earlier than waiting for an infrastructure of large North Sea CO2 pipelines to be developed. When those pipes become available, UK power companies will be completely ready to connect power plants to store CO2 in large redundant hydrocarbon fields offshore. This could save many tens of millions of tonnes of CO2 per year being emitted into the atmosphere from the UK, and place the UK at the forefront of carbon-reducing nations.

The universities and companies involved in this 2.3M GBP consortium are all experienced in investigating the deep subsurface for oil and gas production. Edinburgh, Heriot-Watt and BGS already have 1.6M GBP from the Scottish Executive to establish the UK's largest research grouping investigating CO2 storage. This expertise will be transferred to exploring for CO2 disposal sites. Using the information held by the British Geological Survey, maps will be made of the subsurface deep beneath England and deep beneath the Forth estuary. Heriot-Watt University will assess the potential chemical reactions of CO2 with rock, and how much CO2 can be injected. Electricity generators, led by Scottish Power, will make engineering designs for modified power stations to supply CO2. Schlumberger and Marathon Oil will assess the subsurface technology required for safe and reliable injection and monitoring. The University of Edinburgh will make computer simulations to determine whether CO2 will leak deep below ground, and will assess how specific storage sites will perform in safely retaining CO2.
Amec will evaluate transport of CO2 by pipe. Tyndall will investigate public attitudes at the candidate storage sites.
Project (2008-2011): University of Leeds
Funder: UKRI | Project Code: EP/G028427/1 | Contribution: 98,178 GBP

Atom interference has been applied in many pioneering experiments, ranging from fundamental studies to precision measurements. The techniques of laser cooling and trapping have allowed the realization of bright sources of macroscopic matter waves. This project is part of a EUROCORES Collaborative Research Project (within the EuroQUASAR programme coordinated by the ESF) whose goal is to build upon this expertise and use interference of quantum degenerate macroscopic matter waves for a new generation of precision measurements. Two sets of applications are envisioned: (1) precision determination of fundamental constants and inertial forces in free space, and (2) interferometers for trapped atoms close to a surface, as a microscope for highly sensitive measurements of surface forces on the micron length scale. To achieve the ultimate sensitivity we will engineer the interactions between the atoms and create non-classical matter-wave quantum states to beat the standard quantum measurement limit.

Ultracold degenerate quantum gases, with their inherent coherence and narrow spread in space and momentum, promise to be the ideal starting point for precision matter-wave interference experiments, much as lasers are for light optics. In contrast to light, atoms interact with each other, and the physics of degenerate quantum gases is in many cases dominated by these interactions. This can be an advantage, allowing tricks from non-linear optics such as squeezing to boost sensitivity, and a disadvantage, resulting in additional dephasing due to uncontrolled collisional phase shifts. We will exploit recent advances in controlling these interactions via Feshbach resonances to pick out the advantages and to suppress the disadvantages caused by the interactions.
Much of the planned work will be very fundamental and exploratory, as many of the capabilities, together with possible limitations, have yet to be investigated. The collaborative research project, entitled Quantum-Degenerate Gases for Precision Measurement (QuDeGPM), focuses European efforts on precision measurements with quantum degenerate gases, and in particular with Bose-Einstein condensates (BECs). The project is organized along the main objectives of (i) performing precision atom interferometry with quantum degenerate gases, (ii) using quantum degenerate gases for precision surface probing, and (iii) exploring, realizing, and testing novel measurement schemes with non-classical matter-wave states. The project in Durham focuses on the use of matter waves with tunable interactions to probe atom-surface interactions. Specifically, two experimental thrusts are planned. The first uses bright matter-wave solitons as the basis for a new form of matter-wave interferometry; this work connects to an existing project which began in January 2008 (EPSRC grant EP/F002068/1). The second thrust exploits condensates whose interactions are tuned to zero to study long-lived Bloch oscillations in a 1D lattice in the vicinity of a solid surface.
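For a sense of scale, the Bloch-oscillation period in a vertical optical lattice is T_B = h / (F d), with F the applied force and d the lattice period. The numbers below are purely illustrative (Rb-87 atoms and a 532 nm lattice laser are assumed here, not taken from the project):

```python
# Bloch-oscillation period for an atom in a vertical optical lattice:
#   T_B = h / (F * d),  with F = m*g and lattice period d = lambda_laser / 2.
# Illustrative values only -- species and lattice wavelength are assumptions.
h = 6.62607015e-34        # Planck constant, J*s
amu = 1.66053907e-27      # atomic mass unit, kg
m = 86.909 * amu          # Rb-87 mass (assumed species)
g = 9.81                  # gravitational acceleration, m/s^2
d = 532e-9 / 2            # lattice period for a 532 nm laser, m

T_B = h / (m * g * d)
print(f"Bloch period: {T_B * 1e3:.2f} ms")
```

A long-lived oscillation at this millisecond-scale period, measured close to a surface, becomes a sensitive probe of any additional short-range force through the same T_B = h / (F d) relation.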
Project (2008-2011): Silistix Ltd, University of Salford, ST Microelectronics
Funder: UKRI | Project Code: EP/E06065X/1 | Contribution: 319,957 GBP

The purpose of this work is to investigate an on-chip network fabric for future reconfigurable computing systems integrating tens or hundreds of processing tiles, implementing embedded microprocessors, intellectual-property cores, reconfigurable fabrics, dedicated local memories and DSP functionality. The reconfigurable NoC (network-on-chip) fabric will direct the effective communication and exchange of data among the multiple processing tiles and enable fault tolerance and very high communication bandwidths with low latency and low energy consumption. The processing tiles will morph their functionality and operating point based on application demands.
Project (2008-2011): University of Salford, UWE, Mas Networks Ltd
Funder: UKRI | Project Code: EP/F069170/1 | Contribution: 339,493 GBP

Masonry arch bridges represent almost half of the European bridge stock. Most are over 100 years old and carry traffic far heavier than their builders designed them for. In order to ensure public safety and maintain the bridge network, estimating the bridges' safe working loads and remaining service life is becoming a pressing issue for bridge owners. The proposed project considers some of the main aspects in relation to masonry arch bridges: inspection, monitoring, assessment and material testing.

Masonry arch bridges vary greatly in material composition, contain large volumes of material and often have unknown internal structures, which makes inspection and monitoring of a bridge's overall condition increasingly problematic. Although a range of NDT (non-destructive testing) techniques is currently available, only a few of these have been adopted for masonry arch bridges and are able to supply useful information on a bridge's structural condition. For that reason, simple empirical procedures (such as visual observation, hammer tapping, and measuring deflection at individual locations) remain to date the most widely used inspection procedures for masonry arch bridges internationally. During preliminary studies, the acoustic emission technique was found to supply highly valuable information on the condition and damage propagation of masonry arch bridges. Within the current project, experience and guidance will be provided to enable wider adoption of the acoustic emission technique for improved condition assessment and monitoring of masonry arch bridges in the field.
Assessment of masonry arch bridges generally relies on a number of subjective factors relating to the material properties and structural condition, and on a limited volume of material test data for calibration. Moreover, most assessment techniques consider a bridge's ultimate load capacity rather than its safe/fatigue load capacity, and give no indication of the remaining service life. The recently developed 'SMART' assessment method offers, for the first time, the possibility of estimating a bridge's safe working limits and remaining service life. While the basics of the 'SMART' assessment method have so far been developed, material test data are now needed for further development of the technique. Providing good-quality material test data for the 'SMART' and other assessment methods forms the basis of the proposed project. A series of laboratory tests is proposed on over 500 small-scale masonry samples under fatigue loading, representing the most common failure modes and material qualities for masonry arch bridges. It is, however, recognised that due to the high variability in masonry properties and loading conditions, the proposed tests will only cover a selection of the wide range of cases found in practice. In order to enable future tests to be incorporated into the test series, a methodology for testing, analysing and structuring test data will also be developed.

The outcomes of the project will a) enable bridge owners responsible for the railway, highway and waterway networks to develop asset management for their masonry arch bridge stock, b) provide tools for bridge engineers and consultants to gain deeper insight into the structural condition of masonry arch bridges and to estimate safe loading limits and residual life, and c) enable researchers to develop assessment methods.
The project will be undertaken in close liaison with Network Rail, the Highways Agency and the European railway authorities to ensure that the needs of bridge owners are met and that project outcomes are incorporated into European practice.
Project (2008-2011): Imperial College London
Funder: UKRI | Project Code: EP/F016271/1 | Contribution: 98,621 GBP

The past fifteen years have seen considerable research into the coupling of superconductivity and magnetism. These two effects are both mediated by coupling between electrons, but ferromagnetism leads to the parallel alignment of spins, while conventional (so-called spin-singlet) superconductivity requires anti-parallel spin alignment. As a result, the coupling of superconductivity into ferromagnets is generally much weaker than the coupling into non-magnetic metals (the so-called proximity effect). However, at very short range (a few nanometres) the coupling between superconductivity and ferromagnetism at the interface between the two materials results in complex behaviour which is distinct from that of either material. Most notably, the pairs of electrons responsible for superconductivity acquire a rapidly oscillating phase in the ferromagnet, which can lead to negative rather than positive supercurrents in devices in which a thin ferromagnetic barrier separates two superconductors. Devices based on this effect are currently being developed for quantum computation.

More controversially, a few very recent experiments have detected a much longer-ranged proximity effect in which superconductivity can penetrate a ferromagnet over distances of hundreds of nanometres. This effect appears to confirm theoretical predictions that if the magnetism is inhomogeneous (i.e. the spins do not all point in a single direction), or the electrons are 100% spin-polarised, then a so-called spin-triplet state of superconductivity should appear. The aim of our proposed project is to investigate carefully the conditions required for the formation of this spin-triplet state and to understand how to control it so that potential applications can be developed.
In particular, we will look at classes of ferromagnet which have spiral rather than linear magnetic order; we will also grow artificial magnetic structures in which such spirals can be changed by applying a magnetic field; and we will explore how the presence of magnetic domain walls (regions in which the magnetisation changes direction within a material) affects the superconducting properties.
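The sign-changing supercurrent mentioned above is often modelled, in a diffusive ferromagnet, by a singlet pair amplitude F(x) ~ exp(-x/xi_F) * cos(x/xi_F). The sketch below (a toy model with a hypothetical coherence length xi_F, not a value from the project) locates the first sign change, i.e. the crossover from a "0" to a "pi" Josephson junction:

```python
import numpy as np

xi_F = 2.0                        # ferromagnetic coherence length, nm (assumed)
x = np.linspace(0.01, 15, 2000)   # ferromagnetic barrier thickness, nm

# Decaying, oscillating singlet pair amplitude inside the ferromagnet
F = np.exp(-x / xi_F) * np.cos(x / xi_F)

# The supercurrent through a barrier of thickness x tracks the sign of F:
# the first sign change marks the 0-to-pi crossover, expected at
# x = (pi/2) * xi_F for this simple functional form.
crossover = x[np.argmax(F < 0)]
print(f"0-pi crossover near {crossover:.2f} nm (pi/2 * xi_F = {np.pi / 2 * xi_F:.2f} nm)")
```

The rapid exp(-x/xi_F) envelope is the short-range singlet decay; the long-range triplet component discussed above is precisely the part of F that escapes this few-nanometre suppression.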
Project (2008-2011): University of Cambridge
Funder: UKRI | Project Code: EP/F019297/1 | Contribution: 188,517 GBP

In principle, photovoltaic (PV) devices could meet all our energy requirements in a sustainable way, but at the moment the capital expense of conventional photovoltaics is too great to be competitive, and the volume in which they can be produced is much too small to make a serious dent in our electricity generating needs. Their relatively high manufacturing cost and the difficulty of scaling the manufacturing process are intrinsic features of their energy-intensive fabrication. In contrast, non-conventional PVs based on organic semiconductors can be processed from solution using high-volume roll-to-roll printing technologies, offering the possibility of large-area devices fabricated on flexible substrates at very low cost. Unfortunately, at present organic PV devices are characterized by prohibitively low external power efficiencies (< 6%). Closing the gap in efficiency between organic and inorganic PV devices is a significant challenge, one which will require a full microscopic understanding of the processes that currently limit organic PV efficiency.

The most promising organic PV devices are currently based on solution-cast blends of conjugated polymers doped with fullerene derivatives. Relatively little is known, however, about the role of the self-assembled nanoscale morphology of such systems in their operational efficiency. In this proposal, we seek to develop a comprehensive mechanistic understanding of the self-assembly processes by which nanoscale structure arises within such PV-applicable materials. In particular, we propose to study the evolution of nanoscale phase separation during film casting using X-ray scattering.
We will also utilize a range of complementary microscopy techniques, from environmental scanning electron microscopy to time-resolved near-field microscopy. The combination of such techniques will permit us to develop a complete picture of film structure from molecular to microscopic length scales. Our proposed project draws together some of the UK's leading polymer scientists and technologists, with the goal of significantly advancing the understanding of the processes that limit organic PV device performance.
Project (2008-2011): ROLLS-ROYCE PLC, ANSYS, University of Southampton
Funder: UKRI | Project Code: EP/F006802/1 | Contribution: 346,237 GBP

Uncertainty is ubiquitous in the mathematical characterisation of engineered and natural systems. In many structural engineering applications, a deterministic characterisation of the response may not be realistic because of uncertainty in the material constitutive laws, operating conditions, geometric variability, unmodelled behaviour, etc. Ignoring these sources of uncertainty, or attempting to lump them into a factor of safety, is no longer widely considered a rational approach, especially for high-performance and safety-critical applications. It is now increasingly acknowledged that modern computational methods must explicitly account for uncertainty and produce a certificate of response variability alongside nominal predictions. Advances in this area are key to bringing closer the promise of computational models as reliable surrogates of reality. This capability would allow significant reductions in the engineering product development cycle, through decreased reliance on extensive experimental testing programmes, and enable the design of systems that perform robustly in the face of uncertainty.

The proposed investigation will address this important research problem and deliver convergent computational methods and efficient software implementations that are orders of magnitude faster than direct Monte Carlo simulation for predicting the response of structural systems in the presence of uncertainty. This work will draw upon developments in stochastic subspace projection theory, which has recently emerged as a highly efficient and accurate alternative to existing techniques in computational stochastic mechanics.
The overall objectives of this project include: (1) formulation of convergent stochastic projection schemes for predicting the static and (low- and medium-frequency) dynamic response statistics of large-scale stochastic structural systems; (2) design and implementation of a state-of-the-art parallel software framework that leverages existing deterministic finite element codes for stochastic analysis of complex structural systems; and (3) laboratory and computer experiments to validate the methods developed. The methods to be developed will find applications in a wide range of structural problems that require efficient and accurate predictions of performance and safety in the presence of uncertainty. This is a crucial first step towards rational design and control strategies that can meet stringent performance targets while ensuring system robustness. Progress in this area would also benefit many other fields in engineering and the physical sciences where there is a pressing need to quantify uncertainty in predictive models based on partial differential equations.
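To illustrate stochastic projection versus direct Monte Carlo on the simplest possible "structure" (a single spring with lognormal stiffness; all values are hypothetical, and a Hermite polynomial chaos projection stands in here for the project's stochastic subspace schemes):

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# One-degree-of-freedom structure: displacement u = F / k, with lognormal
# stiffness k = k0 * exp(sigma * xi), xi ~ N(0, 1).  Values are invented.
F, k0, sigma = 1.0, 10.0, 0.3

def u(xi):
    return F / (k0 * np.exp(sigma * xi))

# Stochastic projection: Hermite polynomial chaos coefficients computed by
# 20-point Gauss-Hermite quadrature (probabilists' convention, N(0,1) weight)
nodes, weights = np.polynomial.hermite_e.hermegauss(20)
weights = weights / np.sqrt(2 * np.pi)   # normalise so sums become expectations

order = 6
coeffs = []
for n in range(order + 1):
    He_n = np.polynomial.hermite_e.hermeval(nodes, [0] * n + [1])
    coeffs.append(np.sum(weights * u(nodes) * He_n) / math.factorial(n))

mean_pce = coeffs[0]                     # zeroth coefficient = mean response

# Direct Monte Carlo needs many samples for comparable accuracy
mean_mc = u(rng.standard_normal(200_000)).mean()

mean_exact = F / k0 * np.exp(sigma**2 / 2)   # closed form for this toy case
print(mean_pce, mean_mc, mean_exact)
```

The projection recovers the response statistics from 20 deterministic evaluations, while Monte Carlo needs hundreds of thousands of samples for similar accuracy; the same economy, applied to large finite element models, is what the stochastic subspace schemes above target.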
Project (2008-2011): LANL, Imperial College London, NSU. Funder: UKRI. Project Code: EP/F00947X/1. Funder Contribution: 298,769 GBP.
The objective of this proposal is to improve imaging techniques which exploit the interaction of waves with matter to reconstruct the physical properties of an object. To date these techniques have been limited by the trade-off between resolution and imaging depth. While long wavelengths can penetrate deep into a medium, the classical diffraction limit precludes the observation of subwavelength structures. Over the past twenty years, near-field microscopy has demonstrated that the diffraction limit can be broken by bringing a probing sensor within one wavelength of the surface of the object to be imaged. The scope of near-field microscopy has now been extended to the reconstruction of subwavelength structures from measurements performed in the far field, an approach with a much wider range of applications since the structure to be imaged is often not directly accessible. The key to subwavelength resolution lies in the physical models employed to describe wave scattering. Conventional imaging methods use the Born approximation, which does not account for the distortion of the probing wave as it travels through the medium to be imaged, thereby neglecting what is known as multiple scattering. Yet multiple scattering is the key mechanism that can encode subwavelength information in the far field, leading to potentially unlimited resolution. New experimental evidence has shown that a resolution better than a quarter of the wavelength can be achieved for an object more than 70 wavelengths away from the probing sensors. This preliminary work has established the fundamental principle that subwavelength resolution from far-field measurements is possible as long as the Born approximation is abandoned and more accurate models of the wave-matter interaction are adopted.
The aim of this proposal is to pursue this new and exciting idea and to focus on more specific applications such as medical diagnostics and geophysical imaging.
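To make the distinction concrete, the following sketch (a hypothetical two-scatterer toy, not the project's model) compares the Born approximation, in which each scatterer is excited only by the incident wave, against a Foldy-Lax solution that includes the multiple scattering between point scatterers; the scatterer strength and geometry are invented for illustration:

```python
import numpy as np

k = 2 * np.pi          # wavenumber (wavelength = 1)
t = 0.5                # hypothetical point-scatterer strength

def G(ra, rb):
    """Free-space 3-D Helmholtz Green's function exp(ikr)/(4*pi*r)."""
    r = np.linalg.norm(np.asarray(ra) - np.asarray(rb))
    return np.exp(1j * k * r) / (4 * np.pi * r)

# Two scatterers a quarter wavelength apart (a subwavelength detail).
pts = [np.array([0.0, 0.0, 0.0]), np.array([0.25, 0.0, 0.0])]
u_inc = lambda r: np.exp(1j * k * r[2])     # unit plane wave along z

# Born approximation: each scatterer sees only the incident wave.
born = lambda r: sum(t * G(r, p) * u_inc(p) for p in pts)

# Foldy-Lax: solve for exciting fields including multiple scattering.
n = len(pts)
A = np.eye(n, dtype=complex)
for i in range(n):
    for j in range(n):
        if i != j:
            A[i, j] = -t * G(pts[i], pts[j])
e = np.linalg.solve(A, np.array([u_inc(p) for p in pts]))
full = lambda r: sum(t * G(r, p) * ei for p, ei in zip(pts, e))

far = np.array([0.0, 0.0, 70.0])   # observation 70 wavelengths away
print(abs(born(far)), abs(full(far)))   # the two models disagree
```

The two far-field amplitudes differ, and it is precisely this multiple-scattering correction that carries the subwavelength information the Born model discards.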
Project (2008-2011): University of Essex. Funder: UKRI. Project Code: EP/F033818/1. Funder Contribution: 364,770 GBP.
The keyboard is a piece of plastic with lots of switches, which provides us with a reliable but very unnatural form of input. The mouse is slightly less primitive; still, it is only an electro-mechanical transducer of musculoskeletal movement. Both have been with us for many years and are still the best computer interfaces we have at the moment, yet they are unusable for people with severe musculoskeletal disorders and are themselves known causes of work-related upper-limb and back disorders: both hugely widespread problems for the UK's workforce. Wouldn't it be nice some day to be able to dispose of them and replace them with Brain-Computer Interfaces (BCIs) capable of directly interpreting the desires and intentions of computer users? This adventurous proposal sets out an innovative and ambitious interdisciplinary research programme at the edge of Computer Science, Biomedical Engineering, Neuroscience and Psychology, aimed at developing precisely these devices with a novel, powerful BCI approach recently developed by the applicants. BCI has been a dream of researchers for many years, but developments have been slow and, with rare exceptions, BCI is still effectively a curiosity that can be used only in the lab. So, what's different about this project? In very recent research, we were able to develop a prototype BCI mouse capable of full 2-D motion control (a rarity in the BCI world), which uniquely can be used by any person, without any prior training, within minutes. This was possible thanks to a completely innovative approach to BCI. Previous BCI designs were based on the paradigm of observing EEG signals looking for specific features or waves, manipulating them, and then making a yes-or-no decision as to whether such features or waves were present.
Contrary to this design wisdom, we dispose of the decision step entirely and allow brain waves to control the computer directly via simple analogue transformations. Furthermore, we only partially design the system, leaving the completion and customisation of the design for each specific user to an evolutionary algorithm which, through an artificial form of Darwinian evolution inside the computer, performs the difficult tasks of selecting the best EEG channels, waves and analogue manipulations. Using these same two ingredients (the analogue approach and evolutionary design), and starting from our successful experimental BCI mouse, this project specifically aims at developing brain-computer interfaces which are sufficiently robust, flexible and cheap to leave the lab and start making a serious impact in the real world. To maximise performance, preliminary work will determine the optimal visual presentation conditions that minimise cognitive load, perceptual errors and target-distractor interference.
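The evolutionary-design idea can be sketched in miniature. This toy (1+1) evolution strategy on simulated EEG-like data is purely illustrative, not the applicants' algorithm: it evolves the weights of a simple analogue (linear) transformation so that the combined signal tracks a hypothetical user intent hidden in one channel:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for multichannel EEG: 8 channels, one of which
# carries a weak copy of the intended control signal plus noise.
T, C = 500, 8
intent = np.sin(np.linspace(0, 10, T))       # desired control signal
eeg = rng.normal(size=(T, C))
eeg[:, 3] += 0.5 * intent                    # hidden informative channel

def fitness(w):
    out = eeg @ w                            # analogue transformation
    return np.corrcoef(out, intent)[0, 1]

# Minimal (1+1) evolution strategy: mutate weights, keep improvements.
w = rng.normal(size=C)
for _ in range(2000):
    cand = w + 0.1 * rng.normal(size=C)
    if fitness(cand) > fitness(w):
        w = cand

print(f"correlation with intent: {fitness(w):.2f}")
print(f"heaviest channel: {int(np.argmax(np.abs(w)))}")  # likely the informative one
```

Note there is no yes-or-no decision step anywhere: the evolved weights map brain-like signals continuously onto a control output, which is the design shift the abstract describes.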
Project (2008-2011): University of York. Funder: UKRI. Project Code: EP/F027028/1. Funder Contribution: 228,108 GBP.
Diophantine approximation is a branch of number theory that can loosely be described as a quantitative analysis of the property that every real number can be approximated by a rational number arbitrarily closely. The theory dates back to the ancient Greeks and Chinese, who used good rational approximations to the number pi (3.14159...) in order to accurately predict the positions of planets and stars. The metric theory of Diophantine approximation is the study of the approximation properties of real numbers by rationals from a measure-theoretic (probabilistic) point of view. The central theme is to determine whether a given approximation property holds everywhere except on an exceptional set of measure zero. In his pioneering work of 1924, Khintchine established an elegant probabilistic criterion (a 'zero-one' law) in terms of Lebesgue measure for a real number to be approximable by rationals with an arbitrary decreasing (monotonic) error. The error is a function of the size of the denominators of the rational approximants and decreases as the size of the denominators increases. The monotonicity assumption is crucial, since the criterion is false otherwise. Under the natural assumption that the rational approximants are reduced (i.e. in their lowest form, so that the error of approximation at a rational point is determined uniquely), the Duffin-Schaeffer conjecture (1941) provides the appropriate expected statement without the monotonicity assumption. It represents one of the most famous unsolved problems in number theory. A major aim is to make significant contributions to this key conjecture by exploiting the recent 'martingale' approach developed by Haynes (the named Research Assistant) and Vaaler.
Furthermore, a more general form of the conjecture in which Lebesgue measure is replaced by Hausdorff measure (a fractal quantity) will be investigated. A major outcome will be the Duffin-Schaeffer conjecture for measures close to Lebesgue measure. The importance of the Duffin-Schaeffer conjecture is unquestionable. However, it does change the underlying nature of the problem considered by Khintchine, in that the rational approximants are reduced. In 1971, Catlin stated a conjecture for the unconstrained problem, in which the rationals are not assumed to be reduced. Catlin claimed that his conjecture was equivalent to the Duffin-Schaeffer conjecture; however, his proof contained a serious flaw, and the claim remains an interesting problem in its own right. In higher dimensions, the approximation of arbitrary points in n-dimensional space by rational points (simultaneous approximation) or rational hyperplanes (dual approximation) is the natural generalisation of the one-dimensional theory. Considering a system of linear forms unifies both settings and naturally gives rise to the linear forms theory. The metric theory of Diophantine approximation is complete for simultaneous approximation in dimension greater than one: the analogues of Khintchine's criterion without any monotonicity assumption (i.e. the simultaneous Catlin conjecture) and the Duffin-Schaeffer conjecture have both been established, as have the more precise and delicate Hausdorff measure-theoretic statements. However, the dual theory, and more generally the linear forms theory, are far from complete. In this proposal the linear forms analogues of the Duffin-Schaeffer and Catlin conjectures are precisely formulated. A principal goal is to establish these conjectures in dimension greater than one. A novel idea is to develop a 'slicing' technique that reduces a linear forms problem to a well-understood simultaneous problem. The major outcome will be a unified linear forms theory in Euclidean space.
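The quality of approximation at stake can be seen numerically: the continued-fraction convergents p/q of a real number x satisfy |x - p/q| < 1/q^2, far better than a typical denominator achieves. A small sketch using the standard convergent recurrence (illustrative only, applied to pi as in the ancient astronomical approximations mentioned above):

```python
from fractions import Fraction
from math import pi, floor

def convergents(x, n):
    """First n continued-fraction convergents p/q of x."""
    out, a = [], x
    h0, h1, k0, k1 = 0, 1, 1, 0       # standard recurrence seeds
    for _ in range(n):
        ai = floor(a)
        h0, h1 = h1, ai * h1 + h0     # numerators p
        k0, k1 = k1, ai * k1 + k0     # denominators q
        out.append(Fraction(h1, k1))
        a = 1.0 / (a - ai)
    return out

# Every convergent p/q satisfies |x - p/q| < 1/q**2.
for c in convergents(pi, 5):
    q = c.denominator
    err = abs(pi - c.numerator / q)
    print(f"{c}  error={err:.2e}  1/q^2={1 / q**2:.2e}")
    assert err < 1 / q**2
```

The second convergent is the familiar 22/7, and the fourth, 355/113, approximates pi to better than three parts in ten million.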
Project (2008-2011): University of Edinburgh. Funder: UKRI. Project Code: DT/F007744/1. Funder Contribution: 719,422 GBP.
Carbon dioxide (CO2) is a greenhouse gas. The concentration of CO2 in the Earth's atmosphere is an important control on surface temperature, and hence climate. CO2 dissolution in the oceans is also being recognised as an important factor in making surface seawater unusually acidic; this severely affects ecosystems and species from algae to fish and whales. Increased CO2 in the atmosphere is recognised as being partly caused by the burning of fossil fuels, such as coal and gas, in power stations. Carbon Capture and Storage (CCS) is a suite of technologies which enables CO2 to be captured at power stations, liquefied by increasing the pressure, transported by pipe, and injected deep underground into the pore space of deeply buried sedimentary rocks such as sandstones. This can effectively remove CO2 from the power cycle of fossil fuel use and store it for tens of thousands of years, allowing the atmosphere to return to normal. Because of the very large CO2 volumes involved, it is not possible to build surface stores; because of the acid effects of CO2, it is not possible to inject CO2 into seawater. By contrast, the Intergovernmental Panel on Climate Change (IPCC) has calculated that more than 25% of world CO2 emissions could be stored by geological CCS. This could be a vital technology for the world's future. There is a great deal of interest worldwide in CCS and, because of the offshore oil industry, the North Sea is one of the world's prime areas for CCS to be rapidly developed. However, there are only three full-scale projects in the world at present. For UK power generating companies to become commercially interested, the chain of technologies must both be demonstrated to work reliably and be capable of cost-effective development.
This project aims to identify aquifer sites deep underground which are close to power plants in the UK, where CO2 can be safely stored, but which are quicker and cheaper to develop than offshore sites in the North Sea. This can enable power generating companies to develop CCS over a period of years, on a medium scale, and learn to conduct the industrial operation. If this project is successful, it could lead to take-up of CCS in the UK 10 or 15 years earlier than waiting for an infrastructure of large North Sea pipelines to be developed for CO2. When those pipes become available, UK power companies will be completely ready to connect power plants to store CO2 in large redundant hydrocarbon fields offshore. This could save many tens of millions of tonnes of CO2 per year being emitted into the atmosphere from the UK, and place the UK at the forefront of carbon-reducing nations. The universities and companies involved in this £2.3M consortium are all experienced in investigating the deep subsurface for oil and gas production. Edinburgh, Heriot-Watt and BGS already have £1.6M from the Scottish Executive to establish the UK's largest research grouping to investigate CO2 storage. This expertise will be transferred to exploring for CO2 disposal sites. Using the information held by the British Geological Survey, maps will be made of the subsurface deep beneath England and deep beneath the Forth estuary. Heriot-Watt University will assess the potential chemical reactions of CO2 with rock, and how much CO2 can be injected. Electricity generators, led by Scottish Power, will make engineering designs for modified power stations to supply CO2. Schlumberger and Marathon Oil will assess the subsurface technology required for safe and reliable injection and monitoring. The University of Edinburgh will make computer simulations to determine whether CO2 will leak deep below ground, and will assess how specific storage sites will perform in safely retaining CO2.
Amec will evaluate transport of CO2 by pipe. Tyndall will investigate the public attitudes at the candidate storage sites.
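For a sense of the quantities involved when screening aquifer sites, a common volumetric estimate of storage capacity is M = A * h * phi * rho * E (area, net thickness, porosity, CO2 density at depth, and an efficiency factor). The sketch below uses entirely hypothetical parameter values, not figures from this project:

```python
# Illustrative screening-level volumetric estimate of CO2 storage
# capacity for a saline aquifer; all parameter values are invented.
A   = 100e6      # areal extent [m^2] (100 km^2)
h   = 50.0       # net aquifer thickness [m]
phi = 0.2        # porosity [-]
rho = 700.0      # CO2 density at reservoir conditions [kg/m^3]
E   = 0.02       # storage efficiency factor [-]

M_kg = A * h * phi * rho * E
print(f"screening estimate: {M_kg / 1e9:.1f} Mt CO2")   # -> 14.0 Mt CO2
```

Even this deliberately modest hypothetical site holds on the order of ten million tonnes, which is why a handful of well-chosen onshore aquifers could bridge the gap until large offshore pipelines exist.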
Project (2008-2011): University of Leeds. Funder: UKRI. Project Code: EP/G028427/1. Funder Contribution: 98,178 GBP.
Atom interference has been applied in many pioneering experiments, ranging from fundamental studies to precision measurements. The techniques of laser cooling and trapping have allowed the realisation of bright sources of macroscopic matter waves. This project is part of a EUROCORES Collaborative Research Project (within the EuroQUASAR programme coordinated by ESF) whose goal is to build upon this expertise and use interference of quantum degenerate macroscopic matter waves for a new generation of precision measurements. Two sets of applications are envisioned: (1) precision determination of fundamental constants and inertial forces in free space, and (2) interferometers for trapped atoms close to a surface, as a microscope for highly sensitive measurements of surface forces on the micron length scale. To achieve the ultimate sensitivity we will engineer the interactions between the atoms and create non-classical matter-wave quantum states to beat the standard quantum measurement limit. Ultracold degenerate quantum gases, with their inherent coherence and narrow spread in space and momentum, promise to be the ideal starting point for precision matter-wave interference experiments, much as lasers are for light optics. In contrast to light, atoms interact with each other, and the physics of degenerate quantum gases is in many cases dominated by these interactions. This can be an advantage, allowing tricks from non-linear optics such as squeezing to boost sensitivity, and a disadvantage, resulting in additional dephasing due to uncontrolled collisional phase shifts. We will exploit recent advances in controlling these interactions by Feshbach resonances to harness the advantages and to suppress the disadvantages caused by the interactions.
Much of the planned work will be very fundamental and exploratory, as many of the capabilities, together with possible limitations, have yet to be investigated. The collaborative research project entitled Quantum-Degenerate Gases for Precision Measurement (QuDeGPM) focuses European efforts on precision measurements with quantum degenerate gases, and in particular with Bose-Einstein condensates (BECs). The project is organised along the main objectives of (i) performing precision atom interferometry with quantum degenerate gases, (ii) using quantum degenerate gases for precision surface probing, and (iii) exploring, realising, and testing novel measurement schemes with non-classical matter-wave states. The project in Durham focuses on the use of matter waves with tunable interactions to probe atom-surface interactions. Specifically, two experimental thrusts are planned. The first uses bright matter-wave solitons as the basis for a new form of matter-wave interferometry. This work connects to an existing project which began in January 2008 (EPSRC grant EP/F002068/1). The second thrust exploits condensates whose interactions are tuned to zero, to study long-lived Bloch oscillations in a 1D lattice in the vicinity of a solid surface.
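The motivation for non-classical states can be made concrete: for N uncorrelated atoms the phase sensitivity of an interferometer scales as the standard quantum limit 1/sqrt(N), while suitably entangled states can approach the Heisenberg limit 1/N. A minimal illustration of the scaling (the atom numbers are arbitrary examples):

```python
import numpy as np

# Phase-sensitivity scaling for an N-atom interferometer:
# uncorrelated atoms obey the standard quantum limit (SQL) 1/sqrt(N),
# while maximally entangled states reach the Heisenberg limit 1/N.
for N in (100, 10_000, 1_000_000):
    sql = 1 / np.sqrt(N)
    heisenberg = 1 / N
    print(f"N={N:>9}: SQL={sql:.1e}  "
          f"Heisenberg={heisenberg:.1e}  gain={sql / heisenberg:.0f}x")
```

The potential gain itself grows as sqrt(N), which is why squeezing and other non-classical tricks matter most for the bright, large-N matter-wave sources this project targets.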
Project (2008-2011): Silistix Ltd, University of Salford, ST Microelectronics. Funder: UKRI. Project Code: EP/E06065X/1. Funder Contribution: 319,957 GBP.
The purpose of this work is to investigate an on-chip network fabric that will enable future reconfigurable computing systems integrating tens or hundreds of processing tiles, implementing embedded microprocessors, intellectual property cores, reconfigurable fabrics, dedicated local memories and DSP functionality. The reconfigurable NoC (network-on-chip) fabric will direct the effective communication and exchange of data among the multiple processing tiles, and enable fault tolerance and very high communication bandwidths with low latency and low energy consumption. The processing tiles will morph their functionality and operating point based on application demands.
Project (2008-2011): University of Salford, UWE, Mas Networks Ltd. Funder: UKRI. Project Code: EP/F069170/1. Funder Contribution: 339,493 GBP.
Masonry arch bridges represent almost half of the European bridge stock; most are over 100 years old and carry traffic far heavier than their builders designed them for. In order to ensure public safety and maintain the bridge network, estimating the bridges' safe working loads and remaining service life is becoming a pressing issue for bridge owners. The proposed project considers some of the main aspects relating to masonry arch bridges: inspection, monitoring, assessment and material testing. Masonry arch bridges vary greatly in material composition, contain large volumes of material and often have unknown internal structures, which makes inspection and monitoring of a bridge's overall condition increasingly problematic. Although a range of NDT techniques is currently available, only a few of these have been adopted for masonry arch bridges and are able to supply useful information on a bridge's structural condition. For that reason, simple empirical procedures (such as visual observation, hammer tapping, and measuring deflection at individual locations) remain to date the most widely used inspection procedures for masonry arch bridges internationally. During preliminary studies, the acoustic emission technique was found to supply highly valuable information on the condition and damage propagation of masonry arch bridges. Within the current project, experience and guidance will be provided to enable wider adoption of the acoustic emission technique for improved condition assessment and monitoring of masonry arch bridges in the field.
Assessment of masonry arch bridges generally relies on a number of subjective factors relating to the material properties and structural condition, and on a limited volume of material test data for calibration. Also, most assessment techniques consider a bridge's ultimate load capacity rather than its safe/fatigue load capacity, and give no indication of the remaining service life. The recently developed 'SMART' assessment method offers, for the first time, the possibility of estimating a bridge's safe working limits and remaining service life. While the basics of the 'SMART' assessment method have been developed, material test data are now needed for further development of the technique. Providing good-quality material test data for the 'SMART' and other assessment methods forms the starting point for the proposed project. A series of laboratory tests is proposed on over 500 small-scale masonry samples under fatigue loading, representing the most common failure modes and material qualities for masonry arch bridges. It is recognised, however, that due to the high variability in masonry properties and loading conditions, the proposed tests will only cover a selection of the wide range of cases found in practice. To enable future tests to be incorporated into the test series, a methodology for testing, analysing and structuring test data will also be developed. The outcomes of the project will: a) enable bridge owners responsible for the railway, highway and waterway networks to develop asset management for their masonry arch bridge stock; b) provide tools for bridge engineers and consultants to gain deeper insight into the structural condition of masonry arch bridges and to estimate safe loading limits and residual life; and c) enable researchers to develop assessment methods.
The project will be undertaken in close liaison with Network Rail, the Highways Agency and the European railway authorities to ensure that the needs of bridge owners are met and that project outcomes are incorporated into European practice.
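The difference between ultimate and fatigue capacity can be illustrated with a generic Basquin-type S-N relation, N = C * S^(-m), relating the stress ratio of a load cycle to the number of cycles a sample endures; the constants below are invented for illustration and are not drawn from the project's tests:

```python
# Hypothetical Basquin-type S-N relation for fatigue life,
# N = C * S**(-m); C and m are invented illustrative constants,
# not values fitted to masonry test data.
C, m = 1.0e12, 4.0

def cycles_to_failure(stress_ratio):
    """Allowable load cycles at a given stress ratio S (0 < S <= 1)."""
    return C * stress_ratio ** (-m)

# With m = 4, halving the working stress ratio multiplies
# the predicted fatigue life by 2**4 = 16.
print(cycles_to_failure(0.5) / cycles_to_failure(1.0))   # -> 16.0
```

The strong nonlinearity is the point: a bridge far below its ultimate capacity can still exhaust its fatigue life under heavy repeated traffic, which is what a safe-working-load assessment must capture.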
Project (2008-2011): Imperial College London. Funder: UKRI. Project Code: EP/F016271/1. Funder Contribution: 98,621 GBP.
The past fifteen years have seen considerable research into the coupling of superconductivity and magnetism. These two effects are both mediated by coupling between electrons, but ferromagnetism leads to the parallel alignment of spins, while conventional (so-called spin-singlet) superconductivity requires anti-parallel spin alignment. As a result, the coupling of superconductivity into ferromagnets is generally much weaker than the coupling into non-magnetic metals (the so-called proximity effect). However, at very short range (a few nanometres), the coupling between superconductivity and ferromagnetism at the interface between the two materials results in complex behaviour which is distinct from that of either material. Most notably, the pairs of electrons responsible for superconductivity have a rapidly oscillating phase in the ferromagnet, which can lead to negative rather than positive supercurrents appearing in devices in which a thin ferromagnetic barrier separates two superconductors. Devices based on this effect are currently being developed for quantum computation. More controversially, a few very recent experiments have detected a much longer-ranged proximity effect in which superconductivity can penetrate a ferromagnet over distances of hundreds of nanometres. This effect appears to confirm theoretical predictions that if the magnetism is inhomogeneous (i.e. the spins do not all point in a single direction), or the electrons are 100% spin polarised, then a so-called spin-triplet state of superconductivity should appear. The aim of our proposed project is to investigate carefully the conditions required for the formation of this spin-triplet state, and to understand how to control it so that potential applications can be developed.
In particular, we will look at classes of ferromagnet which have a spiral rather than linear magnetic order; we will also grow artificial magnetic structures in which such spirals can be changed by applying a magnetic field; and we will explore how the presence of magnetic domain walls (regions in which the magnetisation changes direction within a material) affects the superconducting properties.
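The oscillating pair amplitude behind the negative-supercurrent (pi-junction) effect is commonly modelled as a damped oscillation, f(x) ~ exp(-x/xi_F) * cos(x/xi_F), where xi_F is the magnetic coherence length. The sketch below (with a purely illustrative xi_F) locates the first sign change, which sets the barrier thickness at which the supercurrent inverts:

```python
import numpy as np

# Damped-oscillatory singlet pair amplitude inside a ferromagnet,
# f(x) ~ exp(-x/xi_F) * cos(x/xi_F); xi_F value is illustrative only.
xi_F = 2.0                            # magnetic coherence length [nm]
x = np.linspace(0.0, 20.0, 2001)      # depth into the ferromagnet [nm]
f = np.exp(-x / xi_F) * np.cos(x / xi_F)

# The amplitude first changes sign where cos(x/xi_F) = 0,
# i.e. at x = (pi/2) * xi_F; a barrier near this thickness
# inverts the sign of the supercurrent (a pi junction).
first_node = x[np.argmax(f < 0)]
print(f"first sign change near x = {first_node:.2f} nm "
      f"(pi/2 * xi_F = {np.pi / 2 * xi_F:.2f} nm)")
```

By contrast, a spin-triplet component of the kind this project investigates would decay monotonically over a much longer length scale, without these sign changes, which is how the two states are distinguished experimentally.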
Project (2008-2011): University of Cambridge. Funder: UKRI. Project Code: EP/F019297/1. Funder Contribution: 188,517 GBP.
In principle, photovoltaic (PV) devices could meet all our energy requirements in a sustainable way, but at the moment the capital expense of conventional photovoltaics is too great to be competitive, and the volume in which they can be produced is much too small to make a serious dent in our electricity generating needs. Their relatively high manufacturing cost, and the difficulty of scaling the manufacturing process, are intrinsic features of their energy-intensive fabrication. In contrast, non-conventional PVs based on organic semiconductors can be processed from solution using high-volume roll-to-roll printing technologies, offering the possibility of large-area devices fabricated on flexible substrates at very low cost. Unfortunately, at present organic PV devices are characterized by prohibitively low external power efficiencies (< 6%). Closing the gap in efficiency between organic and inorganic PV devices is a significant challenge, one which will require a full microscopic understanding of the processes that currently limit organic PV efficiency. The most promising organic PV devices are currently based on solution-cast blends of conjugated polymers doped with fullerene derivatives. Relatively little is known, however, about the role the self-assembled nanoscale morphology of such systems plays in their operational efficiency. In this proposal, we seek to develop a comprehensive mechanistic understanding of the self-assembly processes by which nanoscale structure arises within such PV-applicable materials. In particular, we propose to study the evolution of nanoscale phase separation during film casting using X-ray scattering.
We will also utilize a range of complementary microscopy techniques, ranging from environmental scanning electron microscopy to time-resolved near-field microscopy. The combination of such techniques will permit us to develop a complete picture of film structure from molecular to microscopic length scales. Our proposed project draws together some of the UK's leading polymer scientists and technologists, with the goal of significantly advancing understanding of the processes that limit organic PV device performance.
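For context on the quoted < 6% figure, a solar cell's power-conversion efficiency follows from its current-voltage figures of merit as PCE = (Jsc * Voc * FF) / P_in. The sketch below uses values broadly representative of an organic PV cell of the era; they are illustrative, not measured data from this project:

```python
# Power-conversion efficiency from J-V figures of merit,
# PCE = (Jsc * Voc * FF) / P_in; all values below are illustrative.
Jsc  = 110.0      # short-circuit current density [A/m^2] (11 mA/cm^2)
Voc  = 0.60       # open-circuit voltage [V]
FF   = 0.65       # fill factor [-]
P_in = 1000.0     # AM1.5 illumination [W/m^2]

pce = Jsc * Voc * FF / P_in
print(f"PCE = {pce * 100:.1f}%")      # -> PCE = 4.3%
```

Each of the three figures of merit is degraded by a different loss process, and each is sensitive to the blend's nanoscale morphology, which is why the structural studies proposed here bear directly on efficiency.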