26 Projects

Filters:
  • 2013-2022
  • UK Research and Innovation
  • UKRI|EPSRC
  • OA Publications Mandate: No
  • 2010
  • 2016
  • Funder: UKRI Project Code: TS/I002170/1
    Funder Contribution: 477,743 GBP

    This project develops an approach, genomic selection, to increase the rate at which varieties of Spring barley are developed. This is a very important crop in national agriculture, particularly for the malting, brewing and distilling industries. It is important that the rate at which improved varieties are created is increased, so that breeders can place more effort on improving disease resistance while maintaining or increasing grain yield and grain quality, which remain of greatest importance to growers and end users.

    Genomic selection is a way of predicting traits purely from genetic markers rather than by direct measurement. These predictions require that a set of plants is first measured for the target traits so that the effect of each marker can be estimated; after that, selection can occur for several generations purely on markers. Direct measurement of many traits can take much longer than a single growing season: seed must first be bulked up over several generations to provide a sufficient quantity for yield trials. In contrast, marker data can be collected within the generation time of any crop, so selection on markers is much faster than conventional selection.

    Other approaches to plant breeding using molecular genetic markers have been in use for many years. In these, a very small number of markers with strong evidence of an effect on a trait are first identified and then tracked through the breeding programme. Genomic selection differs in that all available markers are used to predict traits: the more markers the better. Including all markers gives more accurate prediction of overall trait values even though the precise involvement of each marker is known with less certainty.

    Our study has four themes. Firstly, throughout the life of the project, we shall develop new statistical methods to establish relationships between very high numbers of genetic markers and traits. The methods we develop will be more focused on the problems of plant breeding: most methods to date have been targeted at animal breeding. Secondly, we shall test methods which are available now, using historical data from an existing Spring barley scheme. Results will be used immediately to make selections within this scheme, and we expect to register new varieties from these selections within the five-year life of the project. Next, we shall use results from the analysis of the historical data, together with any early methodological developments we make, to create crosses specifically designed to exploit genomic selection. These will not necessarily be the typical two-parent crosses commonly used by breeders, but may involve more complicated crossing schemes involving, for example, four parents. Within the life of the project, we shall test whether this approach gives a greater response to selection than that achieved by more conventional breeding, although there will be insufficient time to register a new variety. Finally, we shall integrate results and methods from the first three phases to completely redesign the breeding programme to get the greatest advantage out of genomic selection.

    In short, we plan to develop a new approach to Spring barley breeding. Genomic selection could result in a fundamental change to the way crops are bred and enable targets for increased food production and environmental sustainability to be met. Compared to other temperate crops, Spring barley has a short generation time, which makes it well suited to developing and testing these ideas, which may also be applicable to other crops.
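    As a purely illustrative aside (not the project's actual statistical pipeline), the core idea of genomic selection described above - estimate the effect of every marker at once on a measured training set, then rank unmeasured candidates by predicted trait values - can be sketched with ridge regression on synthetic data; all names and numbers below are hypothetical.

```python
# Minimal sketch of genomic selection: predict a trait from many markers at once.
# Synthetic data and plain ridge regression stand in for the project's real methods.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_select, n_markers = 200, 50, 1000

# Marker genotypes coded 0/1/2 (copies of one allele); many small true effects.
X = rng.integers(0, 3, size=(n_train + n_select, n_markers)).astype(float)
true_effects = rng.normal(0, 0.05, n_markers)
y = X @ true_effects + rng.normal(0, 1.0, n_train + n_select)

X_train, y_train = X[:n_train], y[:n_train]   # plants measured for the trait
X_select = X[n_train:]                        # candidates with marker data only

# Ridge (GBLUP-like) estimate of all marker effects simultaneously.
lam = 10.0
A = X_train.T @ X_train + lam * np.eye(n_markers)
beta = np.linalg.solve(A, X_train.T @ y_train)

# Predicted breeding values for unphenotyped candidates; pick the best few.
pred = X_select @ beta
best = np.argsort(pred)[::-1][:5]
print("top candidates:", best, "predicted values:", np.round(pred[best], 2))
```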

  • Funder: UKRI Project Code: EP/H040536/1
    Funder Contribution: 5,997,920 GBP

    Energy-efficient processes are an increasingly key priority for ICT companies, with attention being paid to both ecological and economic drivers. Although in some cases the use of ICT can be beneficial to the environment (for example by reducing journeys and introducing more efficient business processes), countries are becoming increasingly aware of the very large growth in the energy consumption of telecommunications companies. For instance, in 2007 BT consumed 0.7% of the UK's total electricity usage. In particular, the predicted future growth in the number of connected devices, and in internet bandwidth by an order of magnitude or two, is not practical if it leads to a corresponding growth in energy consumption. Regulations may therefore come soon, particularly if governments mandate moves towards carbon neutrality. The applicants therefore believe that this proposal is of great importance in seeking to establish the current limits on ICT performance imposed by known environmental concerns, and then to develop new ICT techniques to provide enhanced performance. In particular, they believe that substantial advances can be achieved through the innovative use of renewable sources and the development of new architectures, protocols and algorithms operating on hardware which will itself allow significant reductions in energy consumption. This will represent a significant departure from accepted practice, where ICT services are provided to meet growing demand without any regard for the energy consequences of the relative location of supply and demand. In this project, therefore, we propose to consider optimised dynamic placement of ICT services, taking account of varying energy costs at the producer and the consumer. Energy consumption in networks today is typically highly concentrated in switching and routing centres. In the project we will therefore consider block transmission of data between centres chosen for optimum renewable energy supply, as power transmission losses will often make the shipping of power to cities (where data centres and switching nodes sit) unattractive. Variable renewable sources such as solar and wind pose fresh challenges in ICT installations and network design, and hence this project will also look at innovative methods of flexible power consumption in block data routers to address this effect. We tackle the challenge along three axes: (i) we seek to design a new generation of ICT infrastructure architectures by addressing the optimisation problem of placing compute and communication resources between the producer and consumer, with the (time-varying) constraint of minimising energy costs; here the architectures will leverage the new hardware becoming available to allow low-energy operation. (ii) We seek to design new protocols and algorithms to enable communications systems to adapt their speed and power consumption according to both user demand and energy availability. (iii) We build on recent advances in hardware which allow the block routing of data at greatly reduced energy levels compared with electronic techniques, and determine hardware configurations (using on-chip monitoring for the first time) to support these dynamic energy and communications needs. Here new network components will be developed, leveraging, for example, recent significant advances in lower-power routing hardware with routing power levels of approximately 1 mW/Gb/s for ns block switching times.
    To help ensure success, industrial partners will contribute their expertise: BT, Ericsson, Telecom New Zealand, Cisco and the BBC will play a key role in supporting the development of the network architectures, providing experimental support and traffic traces, and aiding standards development. Solarflare, Broadcom, Cisco and the BBC will support our protocol and intelligent traffic solutions. Avago, Broadcom and Oclaro will play a key role in the hardware development.
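    As an illustrative toy only (not the project's optimisation framework), the idea of placing services where energy is cheapest, subject to capacity and a transport penalty, can be sketched as a greedy assignment; every site name, price and load below is hypothetical.

```python
# Toy sketch of energy-aware service placement: assign each service to the data
# centre where (local energy price) x (load) plus a transport penalty is lowest.
# All numbers are invented; the real problem is a time-varying optimisation.

centres = {
    "wind_site":  {"energy_cost": 0.02, "capacity": 100.0},
    "solar_site": {"energy_cost": 0.03, "capacity": 80.0},
    "city_dc":    {"energy_cost": 0.09, "capacity": 200.0},
}
services = [("video_cache", 60.0), ("backup", 40.0), ("web_front", 30.0)]
network_penalty = {"wind_site": 0.01, "solar_site": 0.012, "city_dc": 0.0}

placement = {}
for name, load in sorted(services, key=lambda s: -s[1]):   # biggest loads first
    def cost(c):
        info = centres[c]
        if info["capacity"] < load:
            return float("inf")               # cannot host this service
        return load * (info["energy_cost"] + network_penalty[c])
    best = min(centres, key=cost)
    centres[best]["capacity"] -= load
    placement[name] = best

print(placement)
```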

    Views: 217 · Downloads: 1,183
  • Funder: UKRI Project Code: EP/I012060/1
    Funder Contribution: 4,064,050 GBP

    Miniaturisation has become a familiar aspect of modern technology: every year laptops get thinner, mobile phones get smaller, and computers get faster as more and more components can be accommodated on their chips. The emergence of nanoscience as a scientific discipline has been driven by the relentless quest of the electronic device industry over the past four decades for ever-faster chips. The importance of miniaturisation is not just that smaller devices can be packed more closely together, however: when objects become very small indeed, they sometimes acquire entirely new properties that larger objects formed from the same materials do not normally exhibit. Catalysts have been used for over a century to accelerate chemical reactions, and many catalysts consist of metal particles supported on ceramics. For several decades, catalytic converters in car exhausts have used metallic nanoparticles - particles a few billionths of a metre in size - to clean the exhaust gas, because catalytic activity has been found to be dramatically increased by the small size of the active metal. When semiconductors are formed into structures of the same size, they acquire entirely new optical properties purely as a consequence of their small size - for example, they glow brightly when stimulated by electrical current, and the colour of the light emitted is determined by the size of the particle (and can thus be controlled with high precision). These phenomena are referred to as low-dimensional ones: they are new, unexpected phenomena that result only from the small size of the active objects.

    There is a very important sense in which biological objects may also be said to be low-dimensional. Cells are tiny objects that are driven by processes involving small numbers of molecules. Biologists have recognised that single molecules are quite different from large groups of molecules, and there has therefore been a lot of interest in studying them, because they may help us to understand much better how larger systems work. However, there are no established tools for building systems of interacting single molecules - what might be called 'low-dimensional systems'. New tools are required to achieve this, and the goal of this programme will be to develop them.

    We wish to build a synthetic low-dimensional system, incorporating biological molecules and synthetic models for them, that replicates the photosynthetic pathway of a bacterium. Photosynthesis is the basis for all life on earth, so it has fundamental importance. However, there are other important motivations for studying the marvellously efficient processes by which biological organisms collect sunlight and use it to live, grow and reproduce. The current concerns about the shortage of fossil fuels, and the problems associated with the carbon dioxide produced by burning them, make solar energy a highly attractive solution to many pressing problems. To best exploit the huge amount of solar energy that falls on the earth, even in colder climates like the UK, we may do well to learn from Nature. By building a chip-based system that replicates the photosynthetic behaviour of a biological organism, we will gain new insights into how natural photosynthesis works. More than that, however, we will develop entirely new, biologically inspired design principles that may be useful in understanding many other scientific and engineering problems.

    At a fundamental level, biological systems work quite differently from electronic devices: they are driven by complex signals, and they are fuzzy and probabilistic, whereas microsystems are based on binary logic and are precisely determined. The construction of a functioning low-dimensional system that replicates a cellular pathway will require the adoption, in a man-made structure, of these very different design principles. If we can achieve this, it may yield important new insights into how similar principles could be applied to other technologies.

    Views: 523 · Downloads: 654
  • Funder: UKRI Project Code: EP/I004343/1
    Funder Contribution: 1,078,760 GBP

    Light, and the various ways it interacts with matter, is our primary means of sensing the world around us. It is therefore no surprise that many technologies are based on light; for example, submarine optical fibres make up the backbone of the Internet, and display technology delivers affordable and compact crystal-clear televisions. However, light itself has a limitation that we are still trying to overcome: light cannot be imaged or focused below half its wavelength, known as the diffraction limit. To see smaller objects we must use shorter wavelengths: Blu-ray, for example, uses blue lasers (405 nm) to store more information than DVDs, which use longer-wavelength red lasers (650 nm). Today, we are learning to overcome this limit by incorporating metals in optical devices. The proposed research investigates the use of metals to shatter the diffraction limit in order to create new technological products, expand the capabilities of computers and the internet, and deliver new sensor technologies for healthcare, defence and security.

    We often take for granted just how strongly light can interact with metals. Electricity, oscillating at 50 Hz (essentially very low frequency light), has a wavelength of thousands of kilometres, yet a wall plug is no larger than a couple of inches - well below the diffraction limit! The relatively new capability to structure metal surfaces on the nanoscale now allows us to use this same phenomenon to beat the diffraction limit in the visible spectrum. Metals do this by storing energy in electrons that collectively move in unison with light, called surface plasmons. This approach has recently re-invigorated the study of optics at the nanoscale, feeding the trend to smaller and more compact technologies.

    So what sets nano-optics apart from low-frequency electricity if they share the same physics? I believe the paradigm of nano-optics is the capability to reduce the size of visible and infrared light so that it can occupy the same nanoscale volume as molecular, solid-state and atomic electronic states for the first time. Under natural conditions this mismatch makes light-matter interactions inherently weak and slow. With nano-optics, interactions not only become stronger and faster, but weak effects once difficult to detect are dramatically enhanced. The goal of this proposal is to strengthen such weak effects and utilise them to realise new capabilities in optics.

    With any new type of control come caveats. Firstly, it is difficult to focus light from its normal size to beyond the diffraction limit. Secondly, having overcome the first challenge, light on metal surfaces is short-lived due to a metal's resistance. My research plan is geared to directly address these challenges. The first thrust develops a concept that I recently proposed to mitigate the problem of energy loss to the point where surface plasmons become useful. Building on silicon photonics, a well-established commercial optical communications architecture, I can use established techniques to seamlessly transfer light between the realms of conventional and nano-optics, with the potential for short-term impact on photonics technology. The second thrust exploits my recent breakthrough on surface plasmon lasers, which can generate light directly on the nanoscale and sustain it indefinitely by laser action. This overcomes both challenges in nano-optics simultaneously. While conventional lasers transmit light over large distances, it is the light inside surface plasmon lasers that is unique. I want to use this light for spectroscopy at single-molecule sensitivities. Just as ultra-fast lasers, serving as scientists' camera flash, have given us snapshots of Nature's fleeting processes, so surface plasmon lasers will allow us to probe Nature with unprecedented resolution and control at the scale of individual molecules. Exploring optics at untouched length scales is an exciting opportunity, giving us the potential to make fundamentally new discoveries.
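    The numbers in the abstract can be checked directly with its own "half the wavelength" rule of thumb; the snippet below is just that arithmetic, applied to the two laser wavelengths it mentions, and nothing project-specific.

```python
# The abstract's rule of thumb: features smaller than ~lambda/2 cannot be resolved.
for label, wavelength_nm in [("Blu-ray (blue)", 405), ("DVD (red)", 650)]:
    limit_nm = wavelength_nm / 2   # smallest resolvable feature, roughly
    print(f"{label}: lambda = {wavelength_nm} nm -> diffraction limit ~ {limit_nm:.0f} nm")
```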

    Views: 38 · Downloads: 226
  • Funder: UKRI Project Code: EP/I003983/1
    Funder Contribution: 1,440,650 GBP

    Many biological processes are based on chemical reactions, and viscosity determines how fast molecules can diffuse and react. In cells, therefore, viscosity can affect signalling, transport and drug delivery, and abnormal viscosity has been linked to disease and malfunction. In spite of its importance, measuring viscosity on the scale of a single cell is a challenge. Traditionally used mechanical methods are no longer applicable and must be substituted by a spectroscopic approach. Such spectroscopic approaches exist, e.g. single-particle tracking, monitoring the rate of fluorescence recovery after photobleaching, or monitoring the rate of viscosity-dependent photochemical reactions. However, all of the above are single-point measurements and, in the complex, heterogeneous environment of a cell, cannot provide full information. A spectroscopic approach which allows imaging or mapping of viscosity would be of great benefit. This proposal aims to measure and map viscosity inside a single cell with high precision and high spatial resolution using novel fluorescent probes called molecular rotors. In molecular rotors, fluorescence competes with intramolecular rotation. In a viscous environment rotation is slowed down, and this strongly affects fluorescence; viscosity can therefore be measured by detecting the change in either the fluorescence spectra or the fluorescence lifetimes. Existing technology allows imaging of either the fluorescence spectra or lifetimes with excellent spatial resolution in single live cells. To date we have produced maps of viscosity in certain parts of cells using this approach and demonstrated that local viscosity in those compartments can be up to 100 times higher than that of water.

    An important advantage of the molecular rotor approach is its very short measurement time. Using this advantage, this proposal aims to monitor how viscosity in a cell changes during dynamic biological processes, e.g. changes in membrane structure upon cell perturbation, drug administration and cell death.

    Photodynamic therapy (PDT) is a form of cancer treatment which relies on the generation of short-lived toxic agents within a cell upon irradiation of a drug. The efficacy of this treatment depends critically on the viscosity of the medium through which the cytotoxic agent must diffuse during its short lifespan. This proposal will monitor how cell viscosity and other vital biophysical cell parameters change during PDT. The novelty of our approach is in using spatially resolved irradiation of the drug within cells: for example, we can irradiate a single organelle and monitor the change in the entire cell, or we can irradiate a group of cells and monitor the behaviour of their neighbours. This approach is an ideal tool to directly probe the 'bystander effect', in which cells that have not been directly treated show a significant response to therapy - an effect which is very important in radiation and PDT cancer treatment. This proposal will be carried out in the Chemistry Department at Imperial College London, where multidisciplinary collaborations are established to ensure the success of the work proposed. The project will address fundamental scientific issues in both photochemistry and cell biology, and will also encourage the development of applications, such as measuring viscosity as a diagnostic tool and for monitoring the progress of treatments.
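    For orientation only: molecular-rotor measurements are often calibrated with a Förster-Hoffmann-type power law linking fluorescence lifetime to viscosity, after which a lifetime image converts pixel by pixel into a viscosity map. The constants and the tiny "image" below are entirely hypothetical; real calibrations come from measurements in viscosity standards and are not taken from this proposal.

```python
# Illustrative sketch of turning a fluorescence-lifetime map into a viscosity map
# using an assumed power-law calibration (constants are made up for the example).
import numpy as np

def viscosity_from_lifetime(tau_ns, tau_ref_ns=0.3, eta_ref_cp=1.0, exponent=0.5):
    """Invert tau = tau_ref * (eta / eta_ref)**exponent for viscosity eta in cP."""
    return eta_ref_cp * (tau_ns / tau_ref_ns) ** (1.0 / exponent)

# A toy 3x3 "image" of measured lifetimes (ns) from a lifetime map of a cell.
lifetime_map = np.array([[0.3, 0.5, 0.9],
                         [0.4, 1.2, 2.0],
                         [0.3, 0.8, 3.0]])
viscosity_map = viscosity_from_lifetime(lifetime_map)
print(np.round(viscosity_map, 1))   # local viscosity estimates, in centipoise
```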

    Views: 353 · Downloads: 699
  • Funder: UKRI Project Code: NS/A000011/1
    Funder Contribution: 1,341,660 GBP

    Abstracts are not currently available in GtR for all funded research. This is normally because the abstract was not required at the time of proposal submission, but may be because it included sensitive information such as personal details.

  • Funder: UKRI Project Code: EP/I00548X/1
    Funder Contribution: 2,016,330 GBP

    Despite the changing face of science, the importance of synthesis - the ability to make molecules - has not diminished. To solve the increasingly complex synthetic problems posed by Nature, medicine and materials, we must question the dogma that defines what we know about making organic molecules. This proposal seeks to address the 'synthesis grand challenge': to develop a new blueprint for chemical synthesis that will revolutionise the way molecules are made in response to society's needs. In contrast to conventional synthesis, which often requires numerous chemical operations to link two molecules together, we will activate traditionally inert, but ubiquitous, carbon-hydrogen (C-H) bonds with metal catalysts and transform them directly into useful chemical architecture, thereby streamlining the synthesis of natural products, medicines and materials. This will impact broadly in academia, industry and across modern society, providing (a) better ways of making molecules, (b) cheaper medicines through accelerated drug discovery, (c) advances in materials and chemical biology through the chemical modification of polymers and proteins, (d) potential advances in energy-related research through understanding the mechanism of hydrocarbon oxidation, and (e) an enhanced chemistry knowledge base.

  • Funder: UKRI Project Code: EP/H021779/1
    Funder Contribution: 3,344,520 GBP

    The Evolution and Resilience of Industrial Ecosystems (ERIE) programme will address a series of fundamental questions relating to the application of complexity science to social and economic systems. Our programme of research aims to embed cutting-edge complexity science methods and techniques within prototype computational tools that will provide policymakers with realistic and reliable platforms for strategy-testing in real-world socio-economic systems. The programme includes the gathering of data from case studies, the development and application of appropriate theoretical and computational techniques, simulation using agent-based models, and the incorporation of all these elements into 'serious games' for use by policymakers. We will study the negotiation of policy goals and options, explore the role of models in policymaking, and involve policymakers in the design and testing of our strategy tools.

    The programme will focus on a crucial aspect of the UK economy: the ways in which firms depend on each other, with the interrelationships being multi-level and multi-valued. Within an industrial 'ecosystem' there can be relationships of supply and demand; the transfer of knowledge; competition for labour; the transfer of materials down supply chains; negotiation over standards; collaboration in trade associations and unions; and innovation, product differentiation and branding. We will use mathematical and computational approaches to model these layered, nested, multiscale systems, where the links between actors are dynamic and the exchanges between them are unpredictable, fluctuating and perhaps sporadic. Within this context we will examine concepts and measures of resilience (the ability to recover from external shocks), emergence (the ways in which social institutions arise from individual activities) and immergence (the ways in which individuals react to institutional constraints). This leads us to some of the most intriguing open questions of complexity science, and we will seek answers inspired by the real-world industrial ecosystems captured in our case studies. Our vision is to provide models of multi-level socio-economic systems that are useful for decision makers aiming to 'steer' towards policy-relevant goals. It is not our intention to provide 'the' policy solution to policy problems (specifically, it is not our intention just to show how particular ecosystems may be made more resilient or more sustainable), but rather to provide a suite of tools which will allow decision makers and their representatives to investigate alternative scenarios given a set of assumptions and initial conditions.

    We will apply the methods of data assimilation, largely developed in the context of weather forecasting, to incorporate the inevitably incomplete data from case studies into agent-based models on an ongoing basis, with the aim of providing 'predictive' tools that are continually updated with real-world data. By 'prediction' here we mean the identification of alternative scenarios, along with estimates of the probability that each will be realised over given time frames and of the sensitivity of these to uncertainties in the data and underlying model. It is an integral part of ERIE to study - and involve - those involved in the case study sites: one research stream is concerned with studying those with a stake in the system (controllers, decision makers, customers, workers, etc.), their goals, their policy options and their links with the industrial ecosystems with which they interact.

    The research programme is divided into four streams, each consisting of a number of cross-disciplinary projects. Four post-doctoral researchers and a project officer will work on the programme, together with seven investigators from the disciplines of mathematics, computing science, environmental science and sociology, and nine PhD research students, the latter funded from internal University of Surrey resources.
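    As a loose illustration (in no way one of ERIE's models), the notion of resilience as recovery of system-level output after a shock can be sketched with a toy agent-based supply network: knock out one firm, watch aggregate output drop, then let affected firms re-source. All structure and numbers are invented for the sketch.

```python
# Tiny agent-based sketch of resilience to an external shock in a supply network.
# Firms produce if at least one supplier survives; after a shock they re-source.
import random

random.seed(1)
n_firms = 20
suppliers = {i: random.sample([j for j in range(n_firms) if j != i], 3)
             for i in range(n_firms)}            # each firm starts with 3 suppliers
alive = set(range(n_firms))

def total_output():
    # A firm produces 1 unit if at least one of its suppliers is still operating.
    return sum(1 for i in alive if any(s in alive for s in suppliers[i]))

print("output before shock:", total_output())
alive.discard(0)                                  # the shock: firm 0 fails
print("output after shock: ", total_output())

# Recovery step: firms that lost every supplier find one new surviving supplier.
for i in list(alive):
    if not any(s in alive for s in suppliers[i]):
        suppliers[i].append(random.choice(list(alive - {i})))
print("output after re-sourcing:", total_output())
```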

    Views: 23 · Downloads: 205
  • Funder: UKRI Project Code: EP/I001514/1
    Funder Contribution: 5,346,470 GBP

    The term 'material' is extremely broad, so for simplicity's sake materials are often described as either 'hard' or 'soft'. While hard materials such as ceramics are strong, they are often brittle. In contrast, soft materials such as polymers are often mechanically weak, but can show valuable elastic properties. Combining the two in one new composite can therefore give rise to remarkable new materials which benefit from the advantages of both components. This is just one benefit of combining hard and soft materials. In fact, interaction between hard and soft materials occurs in all walks of life. Whether a medical implant is accepted in the body depends on how cells recognise and interact with the hard implant surface. Controlling this requires that we understand the molecular-scale processes which govern how soft biomolecules interact with surfaces, and also processes occurring on much larger length-scales, most importantly how cells interact with and recognise a hard surface. In this case, the soft matter must adapt to the hard surface, potentially changing its shape and chemical properties. This is important for many applications, from the toxicology of nanoparticles to strategies for environmental remediation. Perhaps surprisingly, it is not only hard materials which control the soft: the converse also occurs. Biomineralisation - the formation by organisms of mineral structures such as bones, teeth and seashells - shows this beautifully. It is through the interaction of growing minerals with soft, organic matter that Nature produces these materials with their remarkable shapes and properties. Biominerals are often very different from synthetic minerals. While a crystal of calcite (calcium carbonate) precipitated in the lab has a regular, geometric form, in the spines of a sea urchin a calcite single crystal is sponge-like, with curved surfaces replacing flat crystal planes. Biominerals are also almost always composites: soft organic molecules are embedded within the crystal. It is this structure which gives biominerals their wonderful mechanical properties - indeed, tooth enamel is one of the hardest materials known. Soft matter not only affects the properties of biominerals, but controls almost every stage of their formation, from the earliest stages of nucleation, through growth, to production of the final biomineral. Insoluble organic molecules define the special environments in which biominerals nucleate and form, while small, soluble organic molecules bind to a crystal during growth, influencing its shape. Clearly, understanding how soft and hard materials interact and control each other is of great importance, with applications spanning disciplines from medicine to geology and from climate science to nanotechnology. The strategies used by biology to produce biominerals can be applied to the design and fabrication of new materials in which the structure can be controlled at the atomic scale and the synthesis carried out under mild conditions. If we can design molecules that attach strongly to surfaces, we can use them to inhibit crystal growth: crystals growing where they should not - in boilers, heating systems and oil wells - remain a major problem in industry and domestic life. Finally, many biomaterials are carbonates. They are part of the planet's carbon cycle, a major way in which carbon dioxide is removed from the atmosphere for long periods.
    In the oceans, structures such as coral reefs are under threat due to changes in oceanic conditions; we need to understand the mechanisms of their growth to understand fully why. Removing carbon dioxide from the atmosphere and converting it into carbonates is a possible carbon-capture strategy. The research carried out in this grant will use both experiment and theory in a unique way to shed light on the fundamental mechanisms behind this most fascinating and essential capability of the biosphere, and to harness this knowledge to develop novel materials.

    Views: 808 · Downloads: 2,319
  • Funder: UKRI Project Code: EP/H02171X/1
    Funder Contribution: 2,710,230 GBP

    Ours has been dubbed the 'age of migration'. Immigration is a major political issue, with increasing media coverage, rising anti-immigration sentiment and the rise of anti-immigration political parties. The issue of migration sits centrally within the wider debate about ethnic and religious diversity and its effects on social cohesion. We are still, though, a long way from understanding these issues and their potential consequences. They seem to rest on beliefs about national identity and ethnicity, but cannot be divorced from the effects of social class, education, economic competition and inequality, or from the influences of geographical and social segregation, social structures and institutions.

    This project will integrate two very different disciplines, social science and complexity science, in order to gain new understanding of these complex social issues. It will do this by building a series of computer simulation models of these social processes - one could think of these as 'serious Sims' programmes that track the social interactions between many individuals. Such simulations allow 'what if' experiments to be performed, so that a deeper understanding of the possible outcomes for the society as a whole can be established from the interactions of many individuals. A difficulty with the computer simulation of complex systems is that if the models are made realistic (in the sense of how people actually behave) they become very complex, which makes them hard to understand, whilst if they are made simple enough to understand they can be too abstract to mean anything useful in terms of real people. This project aims to get around this by making chains of related models, starting with a complex, 'descriptive' model and then simplifying in stages, so that each simulation is a model of the one below it. The simpler models help us understand what is going on in the more complex ones; the more complex models reveal in what ways the simpler ones are accurate, as well as the ways in which they over-simplify. In this way the project will combine the relevance of social science with the rigour of the hard sciences, but at the cost of having to build, check and maintain whole chains of models.

    Building on an established collaboration between social and complexity scientists in Manchester, this project will integrate the two disciplines to produce new insights, techniques and approaches for policymakers and their advisors. This will require both the complexity and the social scientists to develop new techniques. The complexity scientists will develop new families of computer models that capture several aspects of society in one simulation, including: how membership of different groups, origins, classes, etc. is signalled by people (e.g. the way they dress, or their attitudes); the advantages and disadvantages of belonging to several different social groups at the same time; how different but parallel social networks might relate to each other; and how the views of people on specific issues might change in response to their friends, their wider group and even politicians. The social scientists will develop ways of relating these kinds of models to the rich sources of social data that are available, and will collect additional social data where these sources prove inadequate. They will also ensure that the modelling results are interpreted meaningfully and usefully, in particular by ensuring that they are not over-interpreted.

    By bringing together the social science evidence, the layers of simulation models and the combined expertise of the researchers, this project aims to make real progress in understanding the complex, important yet sensitive issues surrounding the processes that underlie the effects of immigration and diversity on social cohesion and integration. From the beginning it will involve policy experts and decision makers to help guide the project and ensure its relevance.
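    As a generic illustration of the kind of mechanism the abstract mentions (views changing in response to friends), the sketch below runs a textbook-style bounded-confidence opinion model; it is not one of the project's model chains, and every parameter is arbitrary.

```python
# Minimal bounded-confidence opinion model: two agents average their views only
# when those views are already close, and clusters of opinion emerge over time.
import random

random.seed(2)
opinions = [random.random() for _ in range(100)]   # attitudes on a 0..1 scale
tolerance, step = 0.2, 0.5

for _ in range(20_000):
    i, j = random.sample(range(len(opinions)), 2)
    if abs(opinions[i] - opinions[j]) < tolerance:  # only similar views interact
        shift = step * (opinions[j] - opinions[i])
        opinions[i] += shift
        opinions[j] -= shift

# Count emergent opinion clusters (bins of width 0.1 that contain agents).
bins = {round(o, 1) for o in opinions}
print("distinct opinion clusters:", len(bins), sorted(bins))
```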

    Views: 28 · Downloads: 64
Advanced search in
Projects
arrow_drop_down
Searching FieldsTerms
Any field
arrow_drop_down
includes
arrow_drop_down
26 Projects
  • Funder: UKRI Project Code: TS/I002170/1
    Funder Contribution: 477,743 GBP

    This project develops an approach, genomic selection, to increase the rate at which varieties of Spring barley are developed. This is a very important crop in national agriculture, particularly for the malting, brewing and distilling industries. It is important that the rate with which improved varieties are created is increased so that more effort can be placed by breeders on improving disease resistance while maintaining or increasing grain yield and grain quality, which remain of greatest importance to growers and end users.Genomic selection represents a way of predicting traits purely from genetic markers rather than by direct measurement. These predictions require that a set of plants is first measured for the target traits so that the effect of each marker can be estimated. However, after that, selection can occur for several generations purely on markers.Direct measurement of many traits can take much longer than a single growing season: seed must first be bulked up over several generations to provide a sufficient quantity for yield trials. In contrast, marker data can be collected within the generation time of any crop and is therefore much faster than conventional selection.Other approaches to plant breeding using genetic molecular markers have been in use for many years. In these, a very small numbers of markers with strong evidence of an affect on a trait are first identified. These are then tracked through the breeding programme. Genomic selection differs in that all available markers are used to predict traits: the more markers the better. The inclusion of all markers gives more accurate prediction of overall trait values even though the precise involvement of each marker is known with less certainty.Our study has four themes. Firstly, throughout the life of the project, we shall develop new statistical methods to establish relationships between very high numbers of genetic markers and traits. The methods we develop will be more focussed on the problems of plant breeding: most methods to date have been targeted at animal breeding. Secondly, we shall test methods which are available now using historical data available from to an existing Spring barley scheme. Results will be used immediately to make selections within this scheme. We expect to register new varieties from these selections within the five year life of the project.Next, we shall use results from the analysis of the historical data together with any early methodological developments we make to create crosses specifically to exploit genomic selection. These crosses may not necessarily be the typical crosses between two parents which are commonly used by breeders but may involve more complicated crossing schemes involving, for example four parents. Within the life of the project, we shall test whether this approach gives a greater response to selection that achieved by more conventional breeding, but there will be insufficient time to resister a new variety.Finally, we shall integrate results and methods from the first three phases to completely redesign the breeding programme to get the greatest advantage out of genomic selection.In short, we plan to develop a new approach to Spring barley breeding .Genomic selection could result in a fundamental change to the way crops are bred and enable targets for increased food production and environmental sustainability to be met. 
Compared to other temperate crops, Spring barley has a short generation time which make it well suited to develop and test these ideas, which may also be applicable to other crops.

    more_vert
  • Funder: UKRI Project Code: EP/H040536/1
    Funder Contribution: 5,997,920 GBP

    Energy efficient processes are increasingly key priorities for ICT companies with attention being paid to both ecological and economic drivers. Although in some cases the use of ICT can be beneficial to the environment (for example by reducing journeys and introducing more efficient business processes), countries are becoming increasingly aware of the very large growth in energy consumption of telecommunications companies. For instance in 2007 BT consumed 0.7% of the UK's total electricity usage. In particular, the predicted future growth in the number of connected devices, and the internet bandwidth of an order of magnitude or two is not practical if it leads to a corresponding growth in energy consumption. Regulations may therefore come soon, particularly if Governments mandate moves towards carbon neutrality. Therefore the applicants believe that this proposal is of great importance in seeking to establish the current limits on ICT performance due to known environmental concerns and then develop new ICT techniques to provide enhanced performance. In particular they believe that substantial advances can be achieved through the innovative use of renewable sources and the development of new architectures, protocols, and algorithms operating on hardware which will itself allows significant reductions in energy consumption. This will represent a significant departure from accepted practices where ICT services are provided to meet the growing demand, without any regard for the energy consequences of relative location of supply and demand. In this project therefore, we propose innovatively to consider optimised dynamic placement of ICT services, taking account of varying energy costs at producer and consumer. Energy consumption in networks today is typically highly confined in switching and routing centres. Therefore in the project we will consider block transmission of data between centres chosen for optimum renewable energy supply as power transmission losses will often make the shipping of power to cities (data centres/switching nodes in cities) unattractive. Variable renewable sources such as solar and wind pose fresh challenges in ICT installations and network design, and hence this project will also look at innovative methods of flexible power consumption of block data routers to address this effect. We tackle the challenge along three axes: (i) We seek to design a new generation of ICT infrastructure architectures by addressing the optimisation problem of placing compute and communication resources between the producer and consumer, with the (time-varying) constraint of minimising energy costs. Here the architectures will leverage the new hardware becoming available to allow low energy operation. (ii) We seek to design new protocols and algorithms to enable communications systems to adapt their speed and power consumption according to both the user demand and energy availability. (iii) We build on recent advances in hardware which allow the block routing of data at greatly reduced energy levels over electronic techniques and determine hardware configurations (using on chip monitoring for the first time) to support these dynamic energy and communications needs. Here new network components will be developed, leveraging for example recent significant advances made on developing lower power routing hardware with routing power levels of approximately 1 mW/Gb/s for ns block switching times. 
In order to ensure success, different companies will engage their expertise: BT, Ericsson, Telecom New Zealand, Cisco and BBC will play a key role in supporting the development of the network architectures, provide experimental support and traffic traces, and aid standards development. Solarflare, Broadcom, Cisco and the BBC will support our protocol and intelligent traffic solutions. Avago, Broadcom and Oclaro will play a key role in the hardware development.

    visibility217
    visibilityviews217
    downloaddownloads1,183
    Powered by Usage counts
    more_vert
  • Funder: UKRI Project Code: EP/I012060/1
    Funder Contribution: 4,064,050 GBP

    Miniaturisation has become a familiar aspect of modern technology: every year, laptops get thinner, mobile phones get smaller, and computers get faster as more and more components can be accommodated on their chips. The emergence of nanoscience as a scientific discipline has been driven by the relentless quest by the electronic device industry over the past four decades for ever-faster chips. The importance of miniaturisation is not just in the fact that smaller devices can be packed more closely together, however: when objects become very small indeed, they sometimes acquire entirely new properties that larger objects formed from the same materials do not normally exhibit. Catalysts have been used for over a century to accelerate chemical reactions, and many catalysts consist of metal particles supported on ceramics. For several decades, catalytic converters in car exhausts have used metallic nanoparticles - particles a few billionths of a metre in size - to clean the exhaust gas because the catalytic activity has been found to be dramatically increased by the small size of the active metal. When semiconductors are formed into structures of the same size, they acquire entirely new optical properties purely as a consequence of their small size - for example, they glow brightly when stimulated by electrical current, and the colour of the light emitted is determined by the size of the particle (and can thus be controlled with high precision). These phenomena are referred to as low-dimensional ones: they are new, unexpected phenomena that result only from the small size of the active objects.There is a very important sense in which biological objects may also be said to be low-dimensional. Cells are tiny objects that are driven by processes that involve small numbers of molecules. Biologists have recognised that single molecules are quite different from large groups of molecules, and there has therefore been a lot of interest in studying them, because they may help us to understand much better how larger systems work. However, there are no established tools for building systems of interacting single molecules, what might be called low-dimensional systems . New tools are required to achieve this, and the goal of this programme will be to develop them.We wish to build a synthetic low-dimensional system, which will incorporate biological molecules and synthetic models for them, that replicates the photosynthetic pathway of a bacterium. Photosynthesis is the basis for all life on earth, so it has fundamental importance. However, there are important other motivations for studying the marvellously efficient processes by which biological organisms collect sunlight and use it to live, grow and reproduce. The current concerns about shortage of fossil fuels, and the problems associated with the carbon dioxide produced by burning them, make solar energy a highly attractive solution to many pressing problems. To best exploit the huge amount of solar energy that falls on the earth, even in colder climates like the UK, we may do well to learn from Nature. By building a ship-based system that replicates the photosynthetic behaviour of a biological organism, we will gain new insights into how Natural photosynthesis works. More than that, however, we will develop entirely new, biologically-inspired design principles that may be useful in understanding many other scientific and engineering problems. 
At a fundamental level, biological systems work quite differently from electronic devices: they are driven by complex signals, they are fuzzy and probabilistic, where microsystems are based on binary logic and are precisely determined. The construction of a functioning low-dimensional system that replicates a cellular pathway will require the adoption, in a man-made structure, of these very different design principles. If we can achieve this it may yield important new insights into how similar principles could be applied to other technologies.

    visibility523
    visibilityviews523
    downloaddownloads654
    Powered by Usage counts
    more_vert
  • Funder: UKRI Project Code: EP/I004343/1
    Funder Contribution: 1,078,760 GBP

    Light and the various ways it interacts with matter is our primary means of sensing the world around us. It is therefore no surprise that many technologies are based on light; for example submarine optical fibres make up the backbone of the Internet and display technology delivers affordable and compact crystal clear televisions. However, light itself has a limitation that we are still trying to overcome: light cannot be imaged or focused below half its wavelength, known as the diffraction limit . To see smaller objects we must use shorter wavelengths. e.g. Blue-ray, uses blue lasers (405 nm) to store more information than DVDs, which use longer wavelength red lasers (650 nm). Today, we are learning to overcome this limit by incorporating metals in optical devices. The proposed research investigates the use of metals to shatter the diffraction limit for creating new technological products, expand the capabilities of computers and the internet and deliver new sensor technologies for healthcare, defense and security.We often take for granted just how strongly light can interact with metals. Electricity, oscillating at 50 Hz (essentially very low frequency light), has a wavelength of thousands of kilometers, yet a wall-plug is no larger than a couple of inches; well below the diffraction limit! The relatively new capability to structure metal surfaces on the nanoscale now allows us to use this same phenomenon to beat the diffraction limit in the visible spectrum. Metals do this by storing energy on the electrons that collectively move in unison with light, called surface plasmons. This approach has recently re-invigorated the study of optics at the nano-scale, feeding the trend to smaller and more compact technologies.So what sets nano-optics aside from low frequency electricity if they share the same physics? I believe the paradigm of nano-optics is the capability to reduce the size of visible and infrared light so that it can occupy the same nano-scale volume as molecular, solid state and atomic electronic states for the first time. Under natural conditions the mismatch makes light-matter interactions inherently weak and slow. With nano-optics, interactions not only become stronger and faster but weak effects once difficult to detect are dramatically enhanced. This goal of this proposal is to strengthen such weak effects and utilize them to realize new capabilities in optics.With any new type of control come caveats. Firstly, it is difficult to focus light from its normal size beyond the diffraction limit. Secondly, having overcome the first challenge, light on metal surfaces is short lived due to a metal's resistance. My research plan is geared to directly address these challenges. The first thrust develops a concept that I recently proposed to mitigate the problem of energy loss to the point where surface plasmons become useful. Building on Silicon Photonics, a well-established commercial optical communications architecture, I can use established techniques to seamlessly transfer light between the realms of conventional and nano-optics with the potential for short term impact on photonics technology. The second thrust exploits my recent breakthrough on surface plasmon lasers, which can generate light directly on the nano-scale and sustain it indefinitely by laser action. This overcomes both challenges in nano-optics simultaneously. While conventional lasers transmit light over large distances, it is the light inside surface plasmon lasers that is unique. 
I want to use this light for spectroscopy at single molecule sensitivities. Just as ultra-fast lasers, serving as scientists' camera flash, have given us snap shots of Nature's fleeting processes, so surface plasmon lasers will allow us to probe Nature with unprecedented resolution and control at the scale of individual molecules. Exploring optics at untouched length scales is an exciting opportunity giving us the potential to make fundamentally new discoveries.

    visibility38
    visibilityviews38
    downloaddownloads226
    Powered by Usage counts
    more_vert
  • Funder: UKRI Project Code: EP/I003983/1
    Funder Contribution: 1,440,650 GBP

    Many biological processes are based on chemical reactions. Viscosity determines how fast molecules can diffuse, and react. Therefore in cells viscosity can affect signalling, transport and drug delivery, and abnormal viscosity has been linked to disease and malfunction. In spite of its importance, measuring viscosity on a scale of a single cell is a challenge. Traditionally used mechanical methods are no longer applicable and must be substituted by a spectroscopic approach. Such spectroscopic approaches exist, e.g. single particle tracking, monitoring the rate of fluorescence recovery after photobleaching, or monitoring the rate of viscosity-dependent photochemical reactions. However all of the above are single point measurements and in a complex heterogeneous environment of a cell can not provide full information. The spectroscopic approach which allows imaging or mapping of viscosity would be of great benefit. This proposal aims to measure and map viscosity inside a single cell with high precision and high spatial resolution using novel fluorescent probes, called molecular rotors. In molecular rotors fluorescence competes with intramolecular rotation. In a viscous environment rotation is slowed down and this strongly affects fluorescence. Thus viscosity can be measured by detecting the change in either the fluorescence spectra or lifetimes. Existing technology allows imaging of either the fluorescent spectra or lifetimes with excellent spatial resolution in single live cells. To date we have produced maps of viscosity in certain parts of cells using this approach and demonstrated that local viscosity in those compartments can be up to 100x higher than that of water.Important advantage of molecular rotor approach is a very short measurement time. Using this advantage, this proposal aims to monitor how viscosity in a cell changes during dynamic biological processes, e.g. change in the membrane structure upon cell perturbation, drug administration and cell death.Photodynamic therapy (PDT) is a form of cancer treatment, which relies on the generation of short-lived toxic agents within a cell upon irradiation of a drug. The efficacy of this treatment critically depends on the viscosity of the medium through which the cytotoxic agent must diffuse during its short life span. This proposal will monitor how cell viscosity and other vital biophysical cell parameters change during PDT. The novelty of our approach is in using spatially resolved irradiation of the drug within cells. E.g. we can irradiate a single organelle and monitor the change in the entire cell. Alternatively, we can irradiate the group of cells and monitor the behaviour of its neighbours. This approach is ideal tool to directly probe the 'bystander effect', when the cells which have not been directly treated show significant response to therapy, the effect which is very important in radiation and PDT cancer treatment. This proposal will be carried out in the Chemistry Department at Imperial College London where multidisciplinary collaborations are established to ensure the success of the work proposed. This project will address both the fundamental scientific issues in photochemistry and cell biology and also encourage the development of applications, such as measuring viscosity as a diagnostic tool and for monitoring the progress of treatments.

    visibility353
    visibilityviews353
    downloaddownloads699
    Powered by Usage counts
    more_vert
  • Funder: UKRI Project Code: NS/A000011/1
    Funder Contribution: 1,341,660 GBP

    Abstracts are not currently available in GtR for all funded research. This is normally because the abstract was not required at the time of proposal submission, but may be because it included sensitive information such as personal details.

    more_vert
  • Funder: UKRI Project Code: EP/I00548X/1
    Funder Contribution: 2,016,330 GBP

    Despite the changing face of science, the importance of synthesis - the ability to make molecules - has not diminished. To solve the increasingly complex synthetic problems posed by Nature, medicine and materials, we must question the dogma that defines what we know about making organic molecules. This proposal seeks to address the 'synthesis grand challenge' to develop a new blueprint for chemical synthesis that will revolutionize the way that molecules are made in response to societies needs. In contrast to conventional synthesis, that often requires numerous chemical operations to link two molecules together, we will activate traditionally inert, but ubiquitous, carbon-hydrogen (C-H) chemical bonds with metal catalysts and transform them directly into a useful chemical architecture thereby streamlining the synthesis of natural products, medicines and materials. This will impact broadly in academia, industry and across modern society, providing (a) better ways of making molecules, (b) cheaper medicines through accelerated drug discovery, (c) advances in materials and chemical biology through chemical modification of polymers and proteins, (d) potential advances in energy related research through understanding the mechanism of hydrocarbon oxidation, and (e) an enhanced chemistry knowledge base.

    more_vert
  • Funder: UKRI Project Code: EP/H021779/1
    Funder Contribution: 3,344,520 GBP

    The Evolution and Resilience of Industrial Ecosystems programme (ERIE) will address a series of fundamental questions relating to the application of complexity science to social and economic systems. Our programme of research aims to embed cutting-edge complexity science methods and techniques within prototype computational tools that will provide policymakers with realistic and reliable platforms for strategy-testing in real-world socio-economic systems. The programme includes the gathering of data from case studies, the development and application of appropriate theoretical and computational techniques, simulation using agent-based models, and the incorporation of all these elements into 'serious games' for use by policymakers. We will study the negotiation of policy goals and options, explore the role of models in policymaking and involve policymakers in the design and testing of our strategy tools.

    The programme will focus on a crucial aspect of the UK economy: the way in which firms are interdependent, with the interrelationships being multi-level and multi-valued. Within an industrial 'ecosystem', there can be relationships of supply and demand; the transfer of knowledge; competition for labour; the transfer of materials down supply chains; negotiation over standards; collaboration in trade associations and unions; and innovation, product differentiation and branding. We will use mathematical and computational approaches to model these layered, nested, multiscale systems, where the links between actors are dynamic and the exchanges between them are unpredictable, fluctuating and perhaps sporadic. Within this context we will examine concepts and measures of resilience (the ability to recover from external shocks), emergence (the ways in which social institutions arise from individual activities) and immergence (the ways in which individuals react to institutional constraints). This leads us to some of the most intriguing open questions of complexity science, and we will seek answers inspired by the real-world industrial ecosystems captured in our case studies.

    Our vision is to provide models of multi-level socio-economic systems that are useful for decision-makers aiming to 'steer' towards policy-relevant goals. It is not our intention to provide 'the' solution to policy problems (specifically, it is not our intention just to show how particular ecosystems may be made more resilient or more sustainable), but rather to provide a suite of tools which will allow decision-makers and their representatives to investigate alternative scenarios given a set of assumptions and initial conditions. We will apply the methods of data assimilation, largely developed in the context of weather forecasting, to incorporate the inevitably incomplete data from case studies into agent-based models on an ongoing basis, with the aim of providing 'predictive' tools that are continually updated with real-world data. By 'prediction' here we mean the identification of alternative scenarios, along with estimates of the probability that each will be realised over given time frames and of the sensitivity of these estimates to uncertainties in the data and the underlying model.

    It is an integral part of ERIE to study, and to involve, the people at the case study sites. One research stream is concerned with those who have a stake in the system - as controllers, decision makers, customers, workers, etc. - their goals, their policy options and their links with the industrial ecosystems they interact with. The research programme is divided into four streams, each consisting of a number of cross-disciplinary projects. Four post-doctoral researchers and a project officer will work on the programme, with seven investigators from the disciplines of mathematics, computing science, environmental science and sociology, and nine PhD research students, the latter funded from internal University of Surrey resources.
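
    As a rough illustration of the 'data assimilation into agent-based models' idea described above, the sketch below reweights an ensemble of toy firm-ecosystem simulations each time a new aggregate observation arrives, in the style of a particle filter. This is a minimal sketch under assumed, illustrative parameters: the firm model, the observation stream and every number in it are hypothetical and are not taken from the ERIE programme itself.

```python
# Hypothetical sketch: particle-filter-style assimilation of observations into an
# ensemble of toy agent-based simulations. All names and numbers are illustrative
# assumptions, not the ERIE programme's actual models or data.
import random
import math

N_FIRMS = 50        # assumed number of firms in the toy ecosystem
N_PARTICLES = 200   # ensemble size
OBS_NOISE = 20.0    # assumed std. dev. of the observation error

def step(state):
    """Advance the toy ecosystem one period: each firm's output drifts randomly
    and is pulled weakly toward the mean output of the ecosystem."""
    mean = sum(state) / len(state)
    return [max(0.0, x + random.gauss(0, 1) + 0.1 * (mean - x)) for x in state]

def observe(state):
    """The quantity we pretend is measured in the real world: total output."""
    return sum(state)

def likelihood(predicted, observed):
    """Gaussian likelihood of the real observation given a particle's prediction."""
    return math.exp(-0.5 * ((predicted - observed) / OBS_NOISE) ** 2)

# Initial ensemble: each particle is one possible state of the ecosystem.
particles = [[random.uniform(5, 15) for _ in range(N_FIRMS)] for _ in range(N_PARTICLES)]

# Pretend stream of periodic observations of total output (purely illustrative).
observations = [520, 540, 510, 560]

for obs in observations:
    particles = [step(p) for p in particles]                             # forecast step
    weights = [likelihood(observe(p), obs) + 1e-12 for p in particles]   # weight by fit to data
    resampled = random.choices(particles, weights=weights, k=N_PARTICLES)
    particles = [p[:] for p in resampled]                                # assimilation step
    estimate = sum(observe(p) for p in particles) / N_PARTICLES
    print(f"observation {obs:6.1f} -> assimilated estimate {estimate:6.1f}")
```

    The point the sketch illustrates is simply that the whole ensemble, rather than a single run, is what gets updated with real-world data, so that alternative scenarios, their probabilities and their sensitivity to data uncertainty can be read off the surviving particles.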

    Views: 23 · Downloads: 205
  • Funder: UKRI Project Code: EP/I001514/1
    Funder Contribution: 5,346,470 GBP

    The term material is extremely broad, so for simplicity's sake materials are often described as either hard or soft. While hard materials such as ceramics are strong, they are often brittle. In contrast, soft materials such as polymers are often mechanically weak but can show valuable elastic properties. Combining the two in one new composite can therefore give rise to remarkable new materials which benefit from the advantages of both components. This is just one benefit of combining hard and soft materials. In fact, interaction between hard and soft materials occurs in all walks of life. Whether a medical implant is accepted in the body depends on how cells recognise and interact with the hard implant surface. Controlling this requires that we understand molecular-scale processes, which govern how soft biomolecules interact with surfaces, and also processes occurring on much larger length-scales, most importantly how cells interact with and recognise a hard surface. In this case, the soft matter must adapt to the hard surface, potentially changing its shape and chemical properties. This is important for many applications - from the toxicology of nanoparticles to strategies for environmental remediation.

    Perhaps surprisingly, it is not only hard materials which control the soft - the converse also occurs. Biomineralisation - the formation of mineral structures such as bones, teeth and seashells by organisms - shows this beautifully. It is through the interaction of growing minerals with soft, organic matter that Nature produces these materials with their remarkable shapes and properties. Biominerals are often very different from synthetic minerals. While a crystal of calcite (calcium carbonate) precipitated in the lab has a regular, geometric form, in the spines of a sea urchin a calcite single crystal is sponge-like, with curved surfaces replacing flat crystal planes. Biominerals are also almost always composites - soft organic molecules are embedded within the crystal. It is this structure which gives biominerals such wonderful mechanical properties - indeed, tooth enamel is one of the hardest materials known. Soft matter not only affects the properties of biominerals, but controls almost every stage of their formation - from the earliest stages of nucleation, through growth, to production of the final biomineral. Insoluble organic molecules define the special environments in which biominerals form and nucleate, while small, soluble organic molecules bind to a crystal during growth, influencing its shape.

    Clearly, understanding how soft and hard materials interact and control each other is of great importance, with applications spanning disciplines from medicine to geology, from climate science to nanotechnology. The strategies used by biology to produce biominerals can be applied to the design and fabrication of new materials, where the structure can be controlled at the atomic scale and the synthesis carried out under mild conditions. If we can design molecules that attach strongly to surfaces, we can use them to inhibit crystal growth; crystals growing where they should not - in boilers, heating systems and oil wells - remain a major problem in industry and domestic life. Finally, many biominerals are carbonates. They are part of the planet's carbon cycle - a major route by which carbon dioxide is removed from the atmosphere for long periods. In the oceans, structures such as coral reefs are under threat due to changes in oceanic conditions; we need to understand the mechanisms of their growth to understand fully why. Removing carbon dioxide from the atmosphere and converting it into carbonates is a possible carbon capture strategy. The research carried out in this grant will use both experiment and theory in a unique way to shed light on the fundamental mechanisms behind this most fascinating and essential capability of the biosphere, and to harness this knowledge to develop novel materials.

    Views: 808 · Downloads: 2,319
  • Funder: UKRI Project Code: EP/H02171X/1
    Funder Contribution: 2,710,230 GBP

    Ours has been dubbed the 'age of migration'. Immigration is a major political issue, with increasing media coverage, rising anti-immigration sentiment and the rise of anti-immigration political parties. The issue of migration sits centrally within the wider debate about ethnic and religious diversity and its effects on social cohesion. We are still, though, a long way from understanding these issues and their potential consequences. They seem to rest on beliefs about national identity and ethnicity, but cannot be divorced from the effects of social class, education, economic competition and inequality, as well as the influences of geographical and social segregation, social structures and institutions.

    This project will integrate two very different disciplines, social science and complexity science, in order to gain new understanding of these complex social issues. It will do this by building a series of computer simulation models of these social processes - one could think of these as 'Serious Sims' programmes that track the social interactions between many individuals. Such simulations allow 'what if' experiments to be performed so that a deeper understanding of the possible outcomes for the society as a whole can be established, based on the interactions of many individuals. A difficulty with the computer simulation of complex systems is that if the models are made realistic (in the sense of how people actually behave) they become very complex, which makes the simulations hard to understand, whilst if they are made simple enough to understand they can be too abstract to mean anything useful in terms of real people. This project aims to get around this by making chains of related models, starting with a complex, 'descriptive' model and then simplifying in stages, so that each simulation is a model of the one below it (see the sketch after this abstract). The simpler models help us understand what is going on in the more complex ones; the more complex models reveal in what ways the simpler ones are accurate, as well as the ways in which they over-simplify. In this way the project will combine the relevance of social science with the rigour of the hard sciences, but at the cost of having to build, check and maintain whole chains of models.

    Building on an established collaboration between social and complexity scientists in Manchester, this project will integrate the two disciplines to produce new insights, techniques and approaches for policy makers and their advisors. This will, however, require both the complexity and the social scientists to develop new techniques. The complexity scientists will develop new families of computer models that capture several aspects of society in one simulation, including: how membership of different groups, origins, classes, etc. is signalled by people (e.g. the way they dress, or their attitudes); the advantages and disadvantages of belonging to several different social groups at the same time; how different but parallel social networks might relate to each other; and how the views of people on specific issues might change in response to their friends, wider group and even politicians. The social scientists will develop ways of relating these kinds of models to the rich sources of social data that are available, and will collect additional social data where those sources prove inadequate. They will also ensure that the modelling results are interpreted meaningfully and usefully, in particular by ensuring that they are not over-interpreted.

    By bringing together the social science evidence, the layers of simulation models and the combined expertise of the researchers, this project aims to make real progress in understanding the complex, important yet sensitive issues surrounding the processes that underlie the effects of immigration and diversity on social cohesion and integration. From the beginning it will involve policy experts and decision makers to help guide the project and ensure its relevance.
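
    To make concrete what a simple model at the bottom of such a chain of models might look like, the sketch below implements a bounded-confidence opinion model on a random friendship network: agents nudge their view on an issue toward a friend's view only when the two views are already close. This is my own minimal illustration under assumed parameters, not one of the project's actual models.

```python
# Hypothetical sketch of a simplest-level opinion-dynamics model on a social network
# (a bounded-confidence rule). All parameters are illustrative assumptions.
import random

N_AGENTS = 100      # assumed population size
N_FRIENDS = 5       # assumed number of friends per agent
TOLERANCE = 0.3     # agents only listen to views within this distance of their own
CONVERGENCE = 0.25  # how far an agent moves toward a friend's view per interaction
N_STEPS = 5000

random.seed(1)

# Opinions on a single issue, scaled to [0, 1].
opinion = [random.random() for _ in range(N_AGENTS)]

# A crude random friendship network (directed, for simplicity).
friends = [random.sample(range(N_AGENTS), N_FRIENDS) for _ in range(N_AGENTS)]

for _ in range(N_STEPS):
    i = random.randrange(N_AGENTS)
    j = random.choice(friends[i])
    # Bounded confidence: only interact if the two views are close enough.
    if abs(opinion[i] - opinion[j]) < TOLERANCE:
        opinion[i] += CONVERGENCE * (opinion[j] - opinion[i])

# Summarise how many distinct opinion clusters remain (views within 0.05 of each other).
clusters = []
for x in sorted(opinion):
    if not clusters or x - clusters[-1][-1] > 0.05:
        clusters.append([x])
    else:
        clusters[-1].append(x)
print(f"{len(clusters)} opinion cluster(s) after {N_STEPS} interactions")
```

    In the chain-of-models approach sketched in the abstract, a toy like this would sit at the simplest end, while richer, 'descriptive' models would add the group memberships, parallel networks and data links that the project describes.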

    Views: 28 · Downloads: 64