Advanced search in Projects
Filters: UK Research and Innovation · UKRI|EPSRC · 2011
45 Projects (10 per page)

  • Funder: UKRI Project Code: EP/I014012/1
    Funder Contribution: 24,220 GBP

    The area of the proposed research comprises those aspects of homological algebra and topology which use methods from other areas of mathematics, such as algebraic and differential geometry, as well as ideas and intuition from theoretical physics. The notions of a Maurer-Cartan element and the Maurer-Cartan equation (also known in physics as a master equation) are classical and indeed go back to the papers of Maurer and Cartan written over 100 years ago; this is a standard tool in differential geometry. It has been noticed that Maurer-Cartan structures appear in various branches of mathematics and theoretical physics, from characteristic classes of foliations to conformal field theory. This notion has been considered by many authors from different points of view, but until now a general treatment has been lacking. The present proposal intends to lay the foundations for the theory of Maurer-Cartan elements and their moduli spaces from the standpoint of rational homotopy theory. The developed apparatus will be applied to deformation theory and graph homology. (The Maurer-Cartan equation itself is written out in a short sketch after this list.)

    Usage counts: 1 view, 8 downloads
  • Funder: UKRI Project Code: EP/I028137/1
    Funder Contribution: 156,810 GBP

    The aim is to exploit a recent discovery concerning the production of a new high-activity catalyst for use in the production of formaldehyde from the oxidation of methanol, using novel nanorod catalysts. These new catalysts have been protected by a patent filing. The key feature of these catalysts is that they give higher yields than the current commercial catalysts. Funding is requested to complete patent exemplification and to ensure commercial exploitation can be achieved.

  • Funder: UKRI Project Code: EP/I033319/1
    Funder Contribution: 30,704 GBP

    Professor Cann will be a Visiting Scholar at the Department of Materials Science and Engineering, Sheffield, for a period of 3 months from April 1st until June 30th 2011. During his study visit, he will undertake research in collaboration with Professor Reaney in the field of PbO-free ceramics which exhibit high electromechanical strain. The main aim of the research is to combine the excellent composition-processing studies carried out in several promising systems at Oregon with the exceptional knowledge of crystal chemistry, structure and microstructure available within Prof. Reaney's group. In addition, Prof. Cann will use his time at Sheffield to visit other interested research groups at Leeds, Imperial, Birmingham, Liverpool and Manchester to foster further links within the UK community. He will, where appropriate, give seminars and discuss his research findings with targeted UK groups who work in electroceramics.

  • Funder: UKRI Project Code: EP/H04339X/1
    Funder Contribution: 16,812 GBP

    This proposal seeks funding to support Professor Nagarajan Valanoor to spend 3 months (October-December 2010) as a visiting researcher in Queen's University Belfast, and work in collaboration with Professor JM Gregg's nanoscale ferroelectrics activity. Prof. Nagarajan is acknowledged internationally as an expert in nanoscale ferroelectrics, with particular expertise in thin film growth, nanoscale patterning and domain imaging using piezo-force microscopy (PFM). His knowledge, experience and interests are strongly aligned with those of the nanoscale ferroelectrics group in Queen's University Belfast (QUB), and his stay in QUB should be extremely useful both in terms of helping to accelerate progress in ongoing research, and in terms of performing preliminary work for potential future collaborative programmes. During his visit, we wish to pursue three themes of common interest: (i) PFM imaging of domain configurations in ferroelectric nanoshapes (contributing to a current EPSRC-funded programme: EP/F004869/1, Investigating the fabrication and dipole characteristics of complex ferroelectric nanoshapes); (ii) modification of the PFM system at QUB to allow dynamical studies of switching in ferroelectric nanoshapes; (iii) preliminary exploration of potential flexoelectric polarisation nanodevices.

  • Funder: UKRI Project Code: EP/J004758/1
    Funder Contribution: 31,918 GBP

    In 2005, the Vehicle and Operator Services Agency (VOSA) introduced a computerised system for reporting MOT (roadworthiness) test results. Since that time, the results of approximately 35,000,000 MOT tests annually have been collected and stored in a Department for Transport (DfT) database. The DfT business plan, published 8 November 2010, promised to make available the "detailed VOSA MOT data", and on 24 November comprehensive data was released, consisting of the results of 150,000,000 MOT tests from 2005 to the spring of 2010. Some fields, such as vehicle registration plates and unique VTS (vehicle test station) identities, have been withheld from the published data in order to preserve anonymity. However, what remains still contains a wealth of information that is not available in any other data set. In addition to the results of the MOT test itself (including detailed reasons for failure), the data include:
    - the vehicle odometer (mileage) reading
    - the vehicle manufacturer, type and engine capacity
    - the vehicle's year of first use
    - the top-level postal area (letters only from the postcode) of the VTS
    Our initial objective is to use the vehicle odometer readings, which are not available in any other (large-scale) data set, combined with the data about vehicle type, to analyse how patterns of vehicle usage (and associated carbon footprint) have changed with time, disaggregated over different regions of the country (a minimal sketch of this kind of analysis appears after this list). The project will therefore aim:
    - to develop software tools for the analysis of the MOT data;
    - to work with the DfT and VOSA on maximising the use that can be made of the MOT data set whilst respecting issues such as data protection;
    - to scope the application of MOT odometer readings and the possibilities for triangulating with other data sets (such as vehicle emissions, new vehicle registrations and Census data);
    - to develop one (or two) small-scale demonstrations illustrating potential applications of our approach.
    The ultimate aim, going beyond the scoping study, is to create a publicly available tool that all those undertaking travel behaviour change initiatives could use to assess the impacts of their work on car ownership, use and related carbon emissions, thereby dramatically reducing the need for every individual project to commission surveys or other forms of travel behaviour measurement. Further research could also include specific analyses of: changes in car ownership and use that have occurred in the Sustainable Travel and Cycling Demonstration Towns; the nature of the distribution and diffusion of electric, hybrid and other alternative-technology vehicles; the location and concentration of 'dirty' vehicle use, with implications for the targeting of climate change and air quality initiatives; and the relationship between car use and physical activity.

    Usage counts: 5 views, 40 downloads
  • Funder: UKRI Project Code: EP/I034092/1
    Funder Contribution: 71,677 GBP

    When you plug your fridge into the mains electricity supply you don't worry about all the technology sitting behind the wall socket -- it just works. Cloud computing is starting to supply IT in a similar fashion. No more worrying about backups, no more hours spent configuring a new or repaired machine -- just plug into the network, fire up your web browser and away you go. Researchers have tougher and more specialised IT needs than most, so to realise the same ease of use that the cloud now provides for email or word processing requires work in several areas. One of these areas is to adapt existing established research tools to the cloud, and that is what this project will do. Our tool is called GATE, a General Architecture for Text Engineering. Over the last decade the UK's GATE system has become a world leader for research and development of text mining algorithms. Text has become a more and more important communication method in recent decades. Our children are now spending over 6 hours in front of screens; our evenings often include sessions on Facebook or writing email to friends and relatives. When we interact with the corporations and governmental organisations whose infrastructure and services underpin our daily lives, we fill in forms or write emails. When we want to publicise our work or share details of our leisure activities we create websites, post Twitter messages or blog entries. Scientists also now use these channels in their work, in addition to publishing in peer-reviewed journals -- a process which has also seen a huge expansion in recent years. This avalanche of the written word has changed many things, not least the way that scientists gather information. For example, a team at the World Health Organisation's cancer research agency recently found the first evidence of a link between a particular genetic mutation and the risk of lung cancer in smokers. Their experiments require large amounts of costly laboratory time to test hypotheses, based on samples of mutations in gene sequences from their test subjects. Text mining from previous publications makes it possible for them to reduce this lab time by factoring in probabilities based on association strengths between mutations, environmental factors and active chemicals (a toy illustration of such association scoring appears after this list). A second area that has been revolutionised by new media is customer relations and market research, which are no longer about monitoring the goings-on of the corporate call centre. Keeping up to date with the public image of your products or services now means coping with the Twitter firehose (45 million posts per day), the comment sections of consumer review sites, or the point-and-click 'contact us' forms from the company website. To do this by hand is now impossible in the general case: the data volume long ago outstripped the possibility of cost-effective manual monitoring. Text mining provides alternative, automatic methods for dealing with text. GATE provides four systems to support scientists experimenting with new text mining algorithms and developers using text mining in their applications:
    - GATE Developer: an integrated development environment for language processing components
    - GATE Embedded: an object library optimised for inclusion in diverse applications
    - GATE Teamware: a collaborative annotation environment for high-volume web-based semantic annotation projects built around a workflow engine
    - GATE Mímir (Multi-paradigm Information Management Index and Repository): a massively scalable multi-paradigm index
    We have identified a need for a particular type of cloud service in our research field, and this project will implement it such that there is close to zero barrier to entry for researchers. Based on our preliminary investigative work, we expect to complete a production-quality service within this project. In simpler terms, this project will work towards making use of GATE on the cloud more like electric sockets and fridges!

    Usage counts: 9 views, 3 downloads
  • Funder: UKRI Project Code: EP/I029370/1
    Funder Contribution: 102,164 GBP

    This project will use the inherent properties of transition metal nitrides (TMNs) as the basis for developing new-generation supercapacitors that deliver high energy and power densities at low cost. The continual increase in energy demands, coupled with a limited supply of fossil fuels, is driving the need for adoption of renewable energy sources. Concerns over CO2 emissions and associated climate change impacts are also spurring technology efforts to make hybrid and electric vehicles widely available. Energy storage is a key issue that needs to be addressed within both these scenarios, and supercapacitors will play a vital role. To meet future energy demands, new-generation supercapacitors must increase their energy densities at least two-fold over current commercially available devices, while maintaining response times of less than one second. They must also be low cost. We will use hard templating and novel microwave-assisted synthesis routes to create structured electrode materials based on TMNs, addressing the key electrode features for supporting good electronic conductivity, good electrolyte mobility and plenty of surface area to increase the total charge-storage capabilities of the supercapacitor. State-of-the-art electron microscopy techniques will enable us to establish the critical link between material structure and performance. The success of this project will initiate a step change in current research directions, basing new developments on high-performing, low-cost materials. These developments will see supercapacitors supporting upcoming technological developments, including use in hybrid-electric vehicles, new portable electronic devices and in delivery grid systems which are supported by renewable energy sources.

  • Funder: UKRI Project Code: EP/I01456X/1
    Funder Contribution: 13,409 GBP

    Program logics play an important role in Computer Science to complement testing. A program logic allows one to prove that a program satisfies a given specification. Seminal work was done in the early seventies by Hoare on axiomatic semantics for stateful programs. Since then many calculi have been developed for all kinds of programming languages, and mechanizations of these logics now exist in numerous verification tools. Two properties of a program logic are of particular interest: soundness states that any property one can prove of a program in the calculus is actually valid; completeness states the converse, namely that any valid property can also be derived. In an ideal world, a formal calculus for a program logic would be both sound and complete, thus faithfully and completely reflecting the semantics of programs and correctness assertions, also called specifications. However, due to Gödel's Incompleteness Theorem it is hopeless to look for (absolutely) complete program logics, since for any formal system S there always exists a correctness assertion which is true but cannot be proved in S. In spite of this, one might ask whether the axioms of some program logic are sufficient to derive all true correctness assertions relative to some complete theory of data, e.g. all true sentences of first order arithmetic. This was first investigated for simple imperative languages where specifications are so-called Hoare triples, of the form {P}C{Q}, where C is a program, P the pre-condition, and Q the post-condition. Such a triple states that if C is run in a state fulfilling P and terminates, the resulting new state will meet assertion Q. The Hoare calculus then provides a set of rules and axioms by which one can derive such triples, i.e. proofs that programs meet a certain specification given by pre- and post-conditions (a few representative rules are shown in the sketch after this list). The property of relative completeness for such a logic was established by Cook in his seminal paper for a simple variant of Hoare logic. He showed that all correct partial correctness assertions of the form {P}C{Q} can be derived using the rules of Hoare's logic, provided we are allowed to use all true sentences of first order arithmetic as axioms. The reason for this is that the language of first order arithmetic is strong enough to express the input/output relation of every program by a formula of first order arithmetic. Program logics, however, are also of interest for functional programs. Popular functional programming languages are ML, Caml, or Haskell. Pure functional languages do not use state but recursively defined data structures and higher-order functions on them. To the best of our knowledge, the question whether relative completeness holds for logics of functional programming languages has not been investigated systematically and thoroughly. Therefore, this project will investigate logics such as D. Scott's LCF (or extensions of it). Experience tells us that verification of most purely functional programs can be expressed within LCF. But it is also easy to find assertions which can neither be proved nor disproved within LCF, like the specification of 'parallel or'. The reason is simply that the former holds in the Scott model but its negation holds in the fully abstract model. It is important to note that these two models do not differ w.r.t. the data type NAT of natural numbers (or the data type NAT->NAT of unary functions on NAT), but they do differ at higher types. Accordingly, it does not make sense to ask whether LCF is relatively complete w.r.t. a full axiomatization of its first order part, since the latter -- unlike for a basic imperative language -- does not fully determine the (higher-type part of the) model. Thus, the right question, the one we will tackle in this project, is whether 'natural' models for PCF can have nice complete axiomatizations.

  • Funder: UKRI Project Code: EP/F063245/2
    Funder Contribution: 38,462 GBP

    Over the past two decades, managers have made major improvements in the efficiency of supply chains, driving out costs by sourcing goods and services from low-cost locations, using new technologies to create greater integration and visibility, reducing the number of suppliers in their supply bases, and outsourcing non-value-adding activities. Unfortunately, while efficient supply chain design works well when the environment is stable and predictable, it also creates vulnerabilities when the environment becomes volatile and uncertain. Arguably, the current business environment typifies the latter, and threats to business continuity have never been higher. Indeed, trends indicate that over the past thirty years the number of natural disasters has increased by a factor of five at the same time as technological disasters rose by a factor of eleven [1]. This project seeks to examine how supply chain design affects vulnerability. The underlying principle is that good design could balance both efficiency and flexibility to disruptions. For example, when lightning wiped out a Philips manufacturing facility that supplied radio frequency chips (RFCs) to both Nokia and Ericsson, their reactions, and subsequent performance, were very different. Nokia quickly set about pressuring Philips for alternative sources of supply while simultaneously redesigning the component for other suppliers. Ericsson, on the other hand, was extremely slow to detect the problem, and although the design of its supply chain was very efficient it was not sufficiently flexible to change the source of supply. The results are telling: Nokia went on to meet its production targets and increase market share from 27% to 30%, while Ericsson posted a $1.7 billion loss and ultimately had to outsource handset production to another company [2]. Similarly, the terrorist attacks of September 11th 2001 created a significant threat to business continuity across the globe, but whereas Ford had to shut its plants for five days, Chrysler used alternative logistics routes to ensure that supply continued [3]. Both examples clearly demonstrate the potential of good design for reducing the impact of disruptions. This work seeks to inform and assist managerial practice by advancing understanding of how supply chain design characteristics affect vulnerability. In doing so, the output will be a rigorous and relevant framework supporting UK firms to identify and prioritise sources of supply chain vulnerability. The framework will be designed around actionable supply chain design variables, such as sourcing strategies and inventory levels, with the ultimate objective of reducing vulnerabilities while maintaining levels of efficiency. The research will draw support from both manufacturing and service industries, and public and private sectors, to ensure that the framework can be tailored to context-specific characteristics.
    [1] Hoyois, P., Scheuren, J-M., Below, R., & Guha-Sapir, D. (2007). Annual Disaster Statistical Review: Numbers and Trends 2006. Brussels: CRED.
    [2] Sheffi, Y. (2005). The Resilient Enterprise. Cambridge, MA: The MIT Press.
    [3] Griffy-Brown, C. (2003). Just-In-Time to Just-In-Case. Graziadio Business Report, 6(2).

  • Funder: UKRI Project Code: EP/I034327/1
    Funder Contribution: 72,282 GBP

    This proposal is focused on enabling researchers to simply and rapidly deploy, execute and monitor scientific software on elastic cloud computing infrastructures. Current interfaces to cloud resources are relatively low level and do not allow researchers to easily benefit from the elasticity that cloud infrastructures offer. Researchers have to deal with time-consuming and often error-prone tasks such as managing access credentials, selecting instance types, managing elastic IP addresses, as well as monitoring resource usage and starting, stopping and terminating instances in response; this keeps researchers from focusing directly on their scientific research. In order to address this problem and to further the uptake of cloud computing services in research, we will develop an elastic wrapper for scientific applications. The elastic wrapper will provide an abstracted gateway to cloud resources and a one-stop-shop interface for researchers wanting to take advantage of cloud resources for their scientific research. It will abstract the complexities of setting up, configuring and managing cloud resources for scientific research applications and provide facilities for execution and collaboration between multiple research sites working on the same problem. The system will take care of issues such as managing resource usage using the elasticity of cloud resources, as well as fault tolerance to insure against resource failure (a hypothetical sketch of such a wrapper interface appears after this list). This project will provide a pilot implementation of the elastic wrapper that will be a generic solution but specifically support two exemplar scientific applications and their usage models: Groups, Algorithms, and Programming (GAP), a free, open-source system for discrete computational algebra with an emphasis on computational group theory, and IDL, a commercial package for statistical and numerical analysis and visualization of scientific datasets.

    Usage counts: 2 views, 3 downloads
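
Sketch 1 (for EP/I014012/1, as referenced above). For readers unfamiliar with the terminology, this is the standard form of the Maurer-Cartan equation for a degree-1 element of a differential graded Lie algebra; it is general background in standard notation, not text taken from the proposal.

% Maurer-Cartan equation: x is a degree-1 element of a differential
% graded Lie algebra (L, d, [.,.]); its solutions are the Maurer-Cartan
% elements whose moduli spaces the proposal sets out to study.
\[
  dx + \tfrac{1}{2}\,[x, x] = 0
\]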
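
Sketch 2 (for EP/J004758/1, as referenced above). A minimal, hypothetical example of the kind of analysis the project describes: estimating mean annual mileage per postcode area from successive odometer readings. All column names (test_date, vehicle_id, odometer, postcode_area) are illustrative stand-ins, not the field names of the released MOT extract, and an anonymised vehicle identifier is assumed to be available for linking tests of the same vehicle.

# Illustrative sketch only -- column names are hypothetical stand-ins for
# whatever fields the released MOT extract actually uses.
import pandas as pd

def annual_mileage_by_area(mot_csv: str) -> pd.DataFrame:
    """Estimate mean annual mileage per postcode area from consecutive
    odometer readings of the same vehicle."""
    df = pd.read_csv(mot_csv, parse_dates=["test_date"])
    df = df.sort_values(["vehicle_id", "test_date"])

    # Odometer and time differences between consecutive tests of a vehicle.
    df["miles_delta"] = df.groupby("vehicle_id")["odometer"].diff()
    df["days_delta"] = df.groupby("vehicle_id")["test_date"].diff().dt.days

    valid = df.dropna(subset=["miles_delta", "days_delta"])
    valid = valid[(valid["miles_delta"] > 0) & (valid["days_delta"] > 90)].copy()
    valid["annual_miles"] = valid["miles_delta"] * 365.0 / valid["days_delta"]

    return (valid.groupby("postcode_area")["annual_miles"]
                 .mean()
                 .reset_index(name="mean_annual_miles"))

The same grouping could be disaggregated further by vehicle type or test year, which is the kind of regional usage pattern the abstract mentions.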
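
Sketch 3 (for EP/I034092/1, as referenced above). A toy illustration of "association strength" between terms across a small document collection, scored with pointwise mutual information over document co-occurrence. This is a generic illustration of the idea, not the GATE API.

# Toy example: score how strongly two terms are associated across documents
# using pointwise mutual information (PMI) over document co-occurrence.
import math
from collections import Counter
from itertools import combinations

def pmi_scores(documents):
    term_counts = Counter()
    pair_counts = Counter()
    n_docs = len(documents)

    for text in documents:
        terms = set(text.lower().split())
        term_counts.update(terms)
        pair_counts.update(combinations(sorted(terms), 2))

    scores = {}
    for (a, b), n_ab in pair_counts.items():
        p_ab = n_ab / n_docs
        p_a = term_counts[a] / n_docs
        p_b = term_counts[b] / n_docs
        scores[(a, b)] = math.log(p_ab / (p_a * p_b))
    return scores

docs = [
    "mutation x linked to lung cancer in smokers",
    "smokers carrying mutation x show elevated cancer risk",
    "benzene exposure is an environmental cancer risk factor",
]
for pair, score in sorted(pmi_scores(docs).items(), key=lambda kv: -kv[1])[:5]:
    print(pair, round(score, 2))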
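
Sketch 4 (for EP/I01456X/1, as referenced above). A few representative rules of Hoare logic in their standard textbook form (assignment axiom, sequential composition, rule of consequence); Cook's relative-completeness result says that these rules, taken together with all true first-order arithmetic sentences as axioms, derive every valid partial-correctness triple {P}C{Q}.

% Standard Hoare-logic rules (general background, not taken from the proposal):
% assignment axiom, sequential composition, and rule of consequence.
\[
  \frac{}{\{Q[e/x]\}\ x := e\ \{Q\}}
  \qquad
  \frac{\{P\}\,C_1\,\{R\} \quad \{R\}\,C_2\,\{Q\}}{\{P\}\,C_1; C_2\,\{Q\}}
  \qquad
  \frac{P \Rightarrow P' \quad \{P'\}\,C\,\{Q'\} \quad Q' \Rightarrow Q}{\{P\}\,C\,\{Q\}}
\]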
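
Sketch 5 (for EP/I034327/1, as referenced above). A purely hypothetical sketch of the kind of interface an "elastic wrapper" might expose, assuming a pluggable cloud back end; none of these class or method names come from the project, and no real cloud SDK is used.

# Hypothetical wrapper interface; the cloud back end is abstracted behind a
# duck-typed object exposing start_instance/stop_instance/run methods.
from dataclasses import dataclass

@dataclass
class JobSpec:
    application: str        # e.g. "GAP" or "IDL"
    command: list           # command line to run on each instance
    min_instances: int = 1
    max_instances: int = 8

class ElasticWrapper:
    """Facade hiding credentials, instance selection and scaling decisions."""

    def __init__(self, backend):
        self.backend = backend
        self.instances = []

    def submit(self, spec):
        # Start the minimum pool and dispatch the job to it.
        for _ in range(spec.min_instances):
            self.instances.append(self.backend.start_instance())
        self.backend.run(spec.command, self.instances)

    def monitor(self, spec, load):
        # Naive elasticity rule: scale out when busy, scale in when idle.
        if load > 0.8 and len(self.instances) < spec.max_instances:
            self.instances.append(self.backend.start_instance())
        elif load < 0.2 and len(self.instances) > spec.min_instances:
            self.backend.stop_instance(self.instances.pop())

    def teardown(self):
        while self.instances:
            self.backend.stop_instance(self.instances.pop())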