
Developing countries are the site of most landscape burning worldwide. They burn the most peatland and forest, have the highest deforestation and net fire-related greenhouse gas emissions, squander economic opportunity by burning agricultural residues, have infrastructure such as power lines and resources such as forest plantations and protected areas at risk from fire, and experience the most recurrent and most severe air pollution events associated with landscape burning. The atmospheric impacts spread far beyond national borders, making this a regional problem through the transport of pollution and a global problem through the climate impacts of carbon emissions. Billions of dollars have been spent on the ground- and space-based infrastructure necessary to provide the real-time, continuous remote sensing observations that support meteorological forecasts worldwide. Our project will harness this infrastructure to benefit developing-country users who, because of the issues above, require accurate, actionable and timely information on the location and characteristics of wildfires in their area of interest, and on the smoke that these fires release into the atmosphere.

Our project will make real-time, accurate and actionable information on landscape fires and fire emissions available through a combination of work by the UK team and our overseas partners. This new information will cover dozens of DAC-list countries in the tropics and sub-tropics that experience significant challenges from landscape burning, so the benefits will extend throughout the tropics and sub-tropics rather than to only a few nations. We will use a new source of continuous, real-time (10 to 15 minute update frequency) meteorological satellite data to provide this real-time intelligence on wildfire state, exploiting algorithms developed under NERC-funded research and working with Partners (IPMA, Portugal and UNAM, Mexico) who will implement these algorithms in their own satellite data processing chains to provide guaranteed 24-hour information availability (99%) on landscape fires. The resulting real-time wildfire information will be made available to users in all developing nations through the already widely used Advanced Fire Information System (AFIS) run by our Partner CSIR, South Africa. AFIS already has tens of thousands of users, and this new pan-tropical information will greatly extend its reach throughout the tropics, since data at this highest temporal resolution is currently available at the highest quality only over North and West Africa. Our project will also provide the information required to turn the real-time fire information into real-time estimates of fire emissions, focusing in particular on health-impacting particulate matter and total carbon emissions. This will benefit developing-country users engaged in assessing particulate and greenhouse gas emissions and in the national programmes aimed at their reduction.
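As an illustration of the kind of conversion involved, fire radiative power (FRP) observations can be integrated over time into fire radiative energy (FRE) and scaled by emission coefficients to estimate the particulate matter and carbon released. The sketch below is a minimal example of this approach; the function name, coefficient values and time step are placeholder assumptions for illustration, not the project's operational processing chain.

# Minimal sketch (illustrative assumptions, not the project's operational chain):
# convert a series of geostationary FRP observations into rough estimates of
# fire radiative energy, particulate matter and carbon emitted.
import numpy as np

def estimate_emissions(frp_mw, timestep_s=900.0,
                       pm_coeff_kg_per_mj=0.02,      # placeholder smoke PM emission coefficient
                       combustion_kg_per_mj=0.37,    # placeholder biomass-combustion factor
                       carbon_fraction=0.5):         # assumed carbon content of dry biomass
    """Integrate FRP (MW) sampled every timestep_s seconds into fire radiative
    energy (MJ), then scale by emission coefficients to estimate the total
    particulate matter and carbon released (kg)."""
    frp_mw = np.asarray(frp_mw, dtype=float)
    fre_mj = float(np.sum(frp_mw) * timestep_s)      # MW x s = MJ
    pm_kg = pm_coeff_kg_per_mj * fre_mj
    carbon_kg = carbon_fraction * combustion_kg_per_mj * fre_mj
    return fre_mj, pm_kg, carbon_kg

# Example: a fire observed at 15-minute intervals over two hours
fre, pm, carbon = estimate_emissions([120, 250, 400, 380, 300, 180, 90, 40])
print(f"FRE = {fre:.0f} MJ, PM = {pm:.0f} kg, carbon = {carbon:.0f} kg")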
Overall, our project will provide a real step-change in the actionable fire information available in the developing countries of South and Southeast Asia, Southern and East Africa, Mexico, and Central and South America. Institutions and individuals in these regions will be able to identify fires burning close to power lines and/or other critical infrastructure (enabling action such as temporarily switching off the power line so the fire can pass underneath without causing problems); fires that have started within or close to forest reserves, plantations or protected areas (with the potential to dispatch fire response crews far more quickly than at present); and fires that are affecting health and national GHG emissions (with information now available to better quantify these, ultimately in support of efforts to reduce them and thereby gain through health improvements and/or REDD+ schemes).

Keywords: Wildfires, smoke, satellites, infrastructure and area protection.
The role of external drivers of climate change in mid-latitude weather events, particularly that of human influence on climate, arouses intense scientific, policy and public interest. In February 2014, the UK Prime Minister stated he "suspected a link" between the flooding at the time and anthropogenic climate change, but the scientific community was, and remains, frustratingly unable to provide a more quantitative assessment. Quantifying the role of climate change in extreme weather events has financial significance as well: for the present, impact-relevant climate change will primarily be felt through changes in extreme events. While slow-onset processes can exacerbate (or ameliorate) the impact of individual weather events, any change in the probability of occurrence of these events themselves could overwhelm this effect. Although this is recognised as a problem, very little is known about the magnitude of such changes in occurrence probabilities, an important knowledge gap this project aims to address. The 2015 Paris Agreement of the UNFCCC has given renewed urgency to understanding relatively subtle changes in extreme weather through its call for research into the impacts of a 1.5°C versus 2°C increase in global temperatures, to contribute to an IPCC Special Report in 2018. Few, if any, mid-latitude weather events can be unambiguously attributed to external climate drivers in the sense that these events would not have happened at all without those drivers. Hence any comprehensive assessment of the cost of anthropogenic climate change and of different levels of future warming must quantify the impact of changing risks of extreme weather, including subtle changes in the risks of relatively 'ordinary' events. The potential, and significance, of human influence on climate affecting the occupancy of the dynamical regimes that give rise to extreme weather in mid-latitudes has long been noted, but only recently have the first tentative reports of an attributable change in regime occupancy begun to emerge. A recent example is the 2014 floods in the southern UK, which are thought to have occurred not because of individually heavy downpours, but because of a more persistent jet. Quantifying such changes presents a challenge because high atmospheric resolution is required for realistic simulation of the processes that give rise to weather regimes, while large ensembles are required to quantify subtle but potentially important changes in regime occupancy statistics and event frequency. Under this project we propose, for the first time, to apply a well-established large-ensemble methodology that allows explicit simulation of changing event probabilities to a global seasonal-forecast-resolution model. We aim to answer the following question: over Europe, does the dynamical response to human influence on climate, manifest through changing occupancy of circulation regimes and event frequency, exacerbate or counteract the thermodynamic response, which is manifest primarily through increased available moisture and energy in individual events? Our focus is on comparing present-day conditions with the counterfactual "world that might have been" without human influence on climate, and on comparing 1.5°C and 2°C future scenarios. While higher forcing provides a higher signal-to-noise ratio, interpretation is complicated by changing drivers and the potential for a non-linear response. We compensate for the lower signal with unprecedentedly large ensembles.
Event attribution has been recognised by the WCRP as a key component of any comprehensive package of climate services. NERC science has been instrumental in its development so far: this project will provide a long-overdue integration of attribution research into the broader agenda of understanding the dynamics of mid-latitude weather.
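As an illustration of how changing event probabilities are typically quantified in large-ensemble attribution work, one counts event occurrences in the factual and counterfactual ensembles and forms a risk ratio. The sketch below uses synthetic data, an arbitrary threshold and placeholder ensemble sizes; it shows the general calculation only, not this project's experimental design.

# Illustrative sketch: estimating the change in probability of an extreme event
# between a factual ensemble (with human influence) and a counterfactual one.
# The synthetic data, threshold and ensemble sizes are placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic seasonal-mean precipitation anomalies for two large ensembles (mm/day)
factual = rng.normal(loc=0.3, scale=1.0, size=10000)          # world as it is
counterfactual = rng.normal(loc=0.0, scale=1.0, size=10000)   # world without human influence

threshold = 2.0  # an 'extreme' seasonal anomaly, chosen purely for illustration

p1 = np.mean(factual > threshold)         # probability of the event with human influence
p0 = np.mean(counterfactual > threshold)  # probability without human influence

risk_ratio = p1 / p0                      # RR > 1: event made more likely
far = 1.0 - p0 / p1                       # fraction of attributable risk

print(f"P1 = {p1:.4f}, P0 = {p0:.4f}, RR = {risk_ratio:.2f}, FAR = {far:.2f}")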
The surface ocean is home to billions of microscopic plants called phytoplankton, which use sunlight and carbon dioxide to produce organic matter. When they die they sink, taking this carbon into the deep ocean, where it is stored on timescales of hundreds to thousands of years, which helps keep our climate the way it is today. The size of the effect they have on our climate is linked to how deep they sink before they dissolve - the deeper they sink, the more carbon is stored. This sinking carbon also provides food to the animals living in the ocean's deep, dark 'twilight zone'. Computer models can help us predict how future changes in greenhouse gas emissions might change this ocean carbon store. Current models, however, struggle with making these predictions. This is partly because until recently we haven't even been able to answer the basic question 'Is there enough food for all the animals living in the twilight zone?'. But in a breakthrough this year we used new technology and new theory to show that there is indeed enough food. So now we can move on to asking what controls how deep the carbon sinks. There are lots of factors which might affect how deep the material sinks, but at the moment we can't be sure which ones are important. In this project we will make oceanographic expeditions to two different places to test how these different factors affect carbon storage in the deep ocean. We will measure the carbon sinking into the twilight zone and the biological processes going on within it. Then we will determine if the systems are balanced - in other words, what goes in should come out again. We will then write equations linking all the parts of the system together and analyse them to make them simpler. At the same time we will test whether the simple equations are still useful by seeing if they produce good global maps of ocean properties for which we have lots of data. Finally, when we are happy that our new equations are doing a good job we will use them in a computer model to predict the future store of carbon in the ocean.
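As an illustration of the kind of simple equation involved, the sinking flux of organic carbon is often described by a power-law decline with depth (the widely used 'Martin curve'). The sketch below uses typical textbook parameter values and is only an example of the sort of relationship the project will test and refine, not the equations it will actually derive.

# Illustrative sketch: the classic power-law ('Martin curve') description of how
# the sinking flux of organic carbon attenuates with depth in the twilight zone.
# The reference flux and exponent are typical literature values, used here only
# as an example of the kind of relationship the project will evaluate.
def martin_flux(depth_m, flux_ref_mgC_m2_d=50.0, b=0.86, z_ref=100.0):
    """Organic carbon flux (mg C m^-2 d^-1) at a given depth, assuming a
    power-law decrease below the reference depth z_ref (m)."""
    return flux_ref_mgC_m2_d * (depth_m / z_ref) ** (-b)

for z in (100, 250, 500, 1000):
    print(f"{z:>5} m: {martin_flux(z):5.1f} mg C m^-2 d^-1")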
Molecular modelling has established itself as a powerful predictive tool for a large range of materials and phenomena whose intrinsic multiscale nature requires modelling tools able to capture their chemical, morphological and structural complexity. In the UK, the molecular modelling community, supported by the software, training and networking activities coordinated by CCP5, has become internationally leading in this field over the past 40 years. Building upon these successes, the new CCP5++ network will revolutionise the field of materials molecular modelling by creating a new, integrated community of modellers, experimentalists and data scientists who together will identify the new frontiers of the field and transform the way these disciplines work together. To achieve this mission, CCP5++ will coordinate and support an ambitious programme of meetings, sandpits, coding workshops, secondments and visitor schemes catering for the large community of modellers, experimentalists and data scientists working on advanced materials. Such support has proved vital in enabling the UK condensed matter community to attain and maintain a position at the forefront of this intensely competitive field, and it will enable UK researchers to identify and tackle major world challenges in in-silico materials discovery. From the start, the network's membership includes key representatives of the experimental and data science communities, international software and modelling institutes, industrial collaborators, and national HPC consortia and CCPs, who, working together, will shape the future of materials molecular modelling.