
The role of external drivers of climate change in mid-latitude weather events, particularly that of human influence on climate, arouses intense scientific, policy and public interest. In February 2014, the UK Prime Minister stated he "suspected a link" between the flooding at the time and anthropogenic climate change, but the scientific community was, and remains, frustratingly unable to provide a more quantitative assessment. Quantifying the role of climate change in extreme weather events has financial significance as well: at present, impact-relevant climate change will be primarily felt through changes in extreme events. While slow-onset processes can exacerbate (or ameliorate) the impact of individual weather events, any change in the probability of occurrence of these events themselves could overwhelm this effect. While this issue is widely recognised, very little is known about the magnitude of such changes in occurrence probabilities, an important knowledge gap this project aims to address. The 2015 Paris Agreement of the UNFCCC has given renewed urgency to understanding relatively subtle changes in extreme weather through its call for research into the impacts of a 1.5°C versus 2°C increase in global temperatures, to contribute to an IPCC Special Report in 2018. Few, if any, mid-latitude weather events can be unambiguously attributed to external climate drivers in the sense that these events would not have happened at all without those drivers. Hence any comprehensive assessment of the cost of anthropogenic climate change, and of different levels of warming in the future, must quantify the impact of changing risks of extreme weather, including subtle changes in the risks of relatively 'ordinary' events. The potential, and significance, of human influence on climate affecting the occupancy of the dynamical regimes that give rise to extreme weather in mid-latitudes has long been noted, but only recently have the first tentative reports of an attributable change in regime occupancy begun to emerge. A recent example is the 2014 floods in the Southern UK, which are thought to have occurred not because of individually heavy downpours, but because of a more persistent jet. Quantifying such changes presents a challenge because high atmospheric resolution is required for realistic simulation of the processes that give rise to weather regimes, while large ensembles are required to quantify subtle but potentially important changes in regime occupancy statistics and event frequency. Under this project we propose, for the first time, to apply a well-established large-ensemble methodology that allows explicit simulation of changing event probabilities to a global seasonal-forecast-resolution model. We aim to answer the following question: over Europe, does the dynamical response to human influence on climate, manifest through changing occupancy of circulation regimes and event frequency, exacerbate or counteract the thermodynamic response, which is primarily manifest through increased available moisture and energy in individual events? Our focus is on comparing present-day conditions with the counterfactual "world that might have been" without human influence on climate, and on comparing 1.5°C and 2°C future scenarios. While higher forcing provides a higher signal-to-noise ratio, interpretation is complicated by changing drivers and the potential for a non-linear response. We compensate for a lower signal with unprecedentedly large ensembles.
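As a concrete illustration of how changing event probabilities can be estimated from such large ensembles, the sketch below compares exceedance frequencies in a factual ensemble and a counterfactual ensemble, and derives a probability ratio with a bootstrap uncertainty range. The arrays, threshold and Gumbel-distributed synthetic data are purely illustrative assumptions, not project output.

```python
import numpy as np

def probability_ratio(factual, counterfactual, threshold, n_boot=10_000, seed=0):
    """Estimate the probability ratio PR = p1/p0 of exceeding `threshold`,
    with a simple bootstrap confidence interval.

    factual, counterfactual : 1-D arrays holding one event metric per
    ensemble member (e.g. seasonal-maximum precipitation); both inputs
    are hypothetical stand-ins for real ensemble output.
    """
    rng = np.random.default_rng(seed)
    p1 = np.mean(factual > threshold)         # event probability with human influence
    p0 = np.mean(counterfactual > threshold)  # probability in the counterfactual world
    ratios = []
    for _ in range(n_boot):
        f = rng.choice(factual, size=factual.size, replace=True)
        c = rng.choice(counterfactual, size=counterfactual.size, replace=True)
        pc = np.mean(c > threshold)
        if pc > 0:
            ratios.append(np.mean(f > threshold) / pc)
    lo, hi = np.percentile(ratios, [5, 95])
    far = 1.0 - p0 / p1 if p1 > 0 else np.nan  # fraction of attributable risk
    return p1 / p0 if p0 > 0 else np.inf, (lo, hi), far

# Illustrative synthetic ensembles: 10,000 members each
rng = np.random.default_rng(1)
factual = rng.gumbel(loc=21.0, scale=5.0, size=10_000)
counterfactual = rng.gumbel(loc=20.0, scale=5.0, size=10_000)
pr, ci, far = probability_ratio(factual, counterfactual, threshold=35.0)
print(f"PR = {pr:.2f}, 90% CI = ({ci[0]:.2f}, {ci[1]:.2f}), FAR = {far:.2f}")
```

The width of the bootstrap interval shrinks with ensemble size, which is why detecting the subtle probability changes discussed above demands ensembles far larger than those used in conventional climate projections.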
Event attribution has been recognised by the WCRP as a key component of any comprehensive package of climate services. NERC science has been instrumental in its development so far: this project will provide a long-overdue integration of attribution research into the broader agenda of understanding the dynamics of mid-latitude weather.
The 2007 floods prompted the UK Government's "Pitt Review", which recommended addressing the causes of flooding upstream of the affected communities, rather than relying solely on downstream engineering solutions. This stimulated a range of organisations to introduce "natural" features into the landscape that may have benefits in terms of reducing flooding (so-called "Natural Flood Management", NFM). Having introduced these features, the organisations, and the local stakeholders working with them, are increasingly asking "Are these features working?" This has highlighted to funders, those implementing the features and scientists alike that there are gaps in the evidence of how individual features (e.g. a single farm pond or a small area of tree planting) work, and of what the potential downstream benefits are for communities at risk of flooding. Stakeholders want both questions answered at the same time, making this one of the most important academic challenges for hydrological scientists in recent years. The only way to quantify the effects of many individual features at larger scales is to use computer models. To be credible, these models also need to produce believable results at individual feature scales. Meeting this challenge is the focus of this research project. Consequently, our primary objective is to quantify the likely effectiveness of these NFM features for mitigating flood risk at large catchment scales in the most credible way. In this context, credibility means being transparent and rigorous in the way that we deal with what we do know and what we don't know when addressing this problem using models. In doing this we need to address particular scientific challenges in the following ways:
* We need to show that our models are capable of reproducing downstream floods while at the same time matching observed local hydrological phenomena, such as patterns of soil saturation. Integral to our methodology are observations of these local phenomena to further strengthen the credibility of the modelling.
* We use the same models to predict NFM effects by changing key model components (illustrated in the sketch after this list). These changes to the components are made in a rigorous way, initially based upon the current evidence.
* As evidence of change is so critical, our project necessarily includes targeted experimental work to address some of the serious evidence gaps, to significantly improve the confidence in the model results.
* This rigorous strategy provides us with a platform for quantifying the magnitude of benefit that can be offered by different spatial extents of NFM implementation across large areas.
By addressing these scientific goals we believe that we can deliver a step change in the confidence of our quantification of the likely effectiveness of NFM measures for mitigating flood risk at large catchment scales.
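As a minimal illustration of the scenario comparison described in the second bullet above, the sketch below routes a design storm through a toy single-reservoir catchment model and represents an NFM intervention simply as additional storage (a longer residence time). The model structure, parameters and storm are hypothetical stand-ins for the distributed catchment models the project would actually use.

```python
import numpy as np

def route_linear_reservoir(rain, k, dt=1.0):
    """Route a rainfall series (mm/h) through a single linear reservoir
    with residence time k (hours); returns outflow in the same units.
    A deliberately minimal stand-in for a distributed catchment model."""
    storage, out = 0.0, np.zeros_like(rain)
    for i, r in enumerate(rain):
        storage += r * dt           # rainfall fills the store
        out[i] = storage / k        # outflow proportional to storage
        storage -= out[i] * dt      # drain the store
    return out

# Hypothetical design storm: 6 hours of 10 mm/h rain
rain = np.concatenate([np.zeros(5), np.full(6, 10.0), np.zeros(25)])

baseline = route_linear_reservoir(rain, k=5.0)   # current catchment response
with_nfm = route_linear_reservoir(rain, k=8.0)   # NFM assumed to add storage,
                                                 # i.e. a longer residence time

reduction = 100 * (baseline.max() - with_nfm.max()) / baseline.max()
print(f"Peak flow reduced by {reduction:.1f}% in this illustrative scenario")
```

The project's actual models are far richer, but the comparison has the same shape: run the calibrated model with and without the changed components, then difference the simulated flood peaks.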
The management of water quality in rivers, urban drainage and water supply networks is essential for ecological and human well-being. Predicting the effects of management strategies requires knowledge of hydrodynamic processes covering spatial scales from a few millimetres (turbulence) to several hundred kilometres (catchments), with a similarly large range of timescales, from milliseconds to weeks. Predicting the underlying water quality processes and their human and ecological impacts is further complicated because these processes depend on contaminant concentration. Current water quality modelling methods range from complex three-dimensional computational fluid dynamics (3D CFD) models, suited to short time and small spatial scales, to one-dimensional (1D) time-dependent models, which are critical for economic, fast, easy-to-use applications within highly complex situations in river catchments, water supply and urban drainage systems. Mixing effects in channels and pipes of uniform geometry can be represented with some confidence in highly turbulent, steady flows. However, in the majority of water networks the standard 1D model predictions fall short because of knowledge gaps relating to low turbulence, 3D geometries and unsteady flows. This Fellowship will address these knowledge gaps, delivering a step change in the predictive capability of 1D water quality network models. It will achieve this via the strategic leadership of a programme of laboratory and full-scale field measurements, the implementation of system identification techniques and active engagement with primary users. The proposal covers aspects from fundamental research, through applications, to end-user delivery, providing a new modelling methodology to inform design, appraisal and management decisions made by environmental regulators, engineering consultants and water utilities.
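The standard 1D models referred to above are commonly built on the advection-dispersion equation, dc/dt + u dc/dx = D d2c/dx2. The sketch below is a minimal explicit finite-difference solution of that equation for a tracer slug in a uniform reach; the velocity, dispersion coefficient and grid are illustrative assumptions, not Fellowship results.

```python
import numpy as np

def advect_disperse(c, u, D, dx, dt, steps):
    """Explicit upwind/central finite-difference solution of the 1-D
    advection-dispersion equation dc/dt + u dc/dx = D d2c/dx2,
    the routing model underlying many 1-D water quality tools."""
    for _ in range(steps):
        adv = -u * (c - np.roll(c, 1)) / dx                        # first-order upwind (u > 0)
        disp = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2  # central diffusion
        c = c + dt * (adv + disp)
    return c

# Hypothetical pipe reach: 100 m in 1 m cells, slug of tracer near the inlet
c = np.zeros(100)
c[5] = 1.0
u, D, dx = 0.5, 0.05, 1.0          # velocity (m/s), dispersion (m2/s), cell size (m)
dt = 0.5 * dx / u                   # respects the CFL condition for this scheme
c_out = advect_disperse(c, u, D, dx, dt, steps=100)
print(f"peak moved to cell {c_out.argmax()}, peak concentration {c_out.max():.3f}")
```

It is precisely the parameters u and D (and the assumption that a single pair suffices) that break down in low-turbulence, unsteady, geometrically complex networks, which is where the proposed measurements and system identification come in.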
Severe weather, with heavy rainfall and strong winds, has been the cause of recent dramatic inland and coastal flooding, and of severe beach and cliff erosion along the British coast. The winters of 2012-2013 and 2013-2014 both saw severe environmental disasters in the UK. The prediction of severe rainfall and storms, and its use to forecast river flooding and storm surges as well as coastal erosion, poses a significant challenge. Uncertainties in the prediction of where and how much precipitation will fall, how high storm surges will be, and from which direction waves and wind will attack coastlines lie at the heart of this challenge. This and other environmental challenges are exacerbated by a changing climate and need to be addressed urgently. As the latest IPCC report confirms, sea level rise and storm intensity combined are very likely to cause more erosion of beaches, cliffs and estuaries. However, it is also clear that considerable uncertainty remains. To address the challenges posed by the prediction and mitigation of severe environmental events, many scientific and technical issues need to be tackled. These share common elements: phenomena involving a wide range of spatial and temporal scales; interaction between continuous and discrete entities; the need to move from deterministic to probabilistic prediction, and from prediction to control; characterisation and sampling of extreme events; merging of models with observations through filtering; and model reduction and parameter estimation. They also share a dual need for improved mathematical models and for improved numerical methods adapted to high-performance computer architectures. Since all these aspects are underpinned by mathematics, it is clear that new mathematical methods can make a major contribution to addressing the challenges posed by severe events. To achieve this, it is crucial that mathematicians with the relevant expertise interact closely with environmental scientists and with end-users of environmental research. At present, the UK suffers from limited interactions of this type. We therefore propose to establish a new Network - Maths Foresees - that will forge strong ties between researchers in the applied mathematics community and researchers in selected strategic areas of the environmental science community and governmental agencies. The activities proposed to reach our objectives include: (i) three general assemblies, (ii) three mathematics-with-industry style workshops, in which stakeholders put forward challenges, (iii) focussed workshops on mathematical issues, (iv) outreach projects in which the science developed is demonstrated in an accessible and conceptual way to the general public, (v) feasibility projects, and (vi) workshops for user groups to disseminate the Network's progress to government agencies.
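Of the mathematical themes listed above, the "merging of models with observations through filtering" can be made concrete with a short sketch: a single ensemble Kalman filter analysis step that pulls a model ensemble towards an observation. This is a textbook illustration under simple Gaussian assumptions, with invented numbers, not a product of the Network.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err, H=lambda x: x, seed=0):
    """One ensemble Kalman filter analysis step for a scalar state:
    merge a model ensemble with a noisy observation. `H` maps the
    state to the observed quantity (identity by default)."""
    rng = np.random.default_rng(seed)
    Hx = np.array([H(x) for x in ensemble])
    # Sample covariance between state and observed quantity, and its variance
    P_xh = np.cov(ensemble, Hx)[0, 1]
    P_hh = np.var(Hx, ddof=1)
    K = P_xh / (P_hh + obs_err**2)                  # Kalman gain
    perturbed = obs + obs_err * rng.standard_normal(ensemble.size)
    return ensemble + K * (perturbed - Hx)          # analysis ensemble

# Hypothetical scalar state, e.g. a storm surge height forecast (m)
prior = np.random.default_rng(1).normal(2.0, 0.5, size=50)
posterior = enkf_update(prior, obs=2.6, obs_err=0.2)
print(f"prior mean {prior.mean():.2f} m -> posterior mean {posterior.mean():.2f} m")
```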
The aim of the project is to quantify uncertainties in the vulnerability of bridges and culverts to blockage or scour, in order to support better guidance and risk models for infrastructure managers and their partners, such as Network Rail and the Environment Agency. The approach will be a formal 'expert elicitation' to quantify the fragility of bridges and culverts at risk of scour or blockage. Erosion and blockage are two significant hazards for major infrastructure networks where they cross rivers or smaller watercourses. In the UK there are estimated to have been 15 fatalities due to flood/scour failure of a structure since the 1840s (RSSB, 2004). Notable incidents include the Glanrhyd railway bridge in Wales, which collapsed in 1987 due to scour of a pier, resulting in four fatalities when a train attempted to cross the collapsed bridge and fell into the river. The Lower Ashenbottom viaduct in Lancashire failed in June 2002 when its central pier collapsed, partly due to scour during a flood event but exacerbated by the presence of debris. In the 2009 Cumbria floods, seven road and foot bridges failed due to the combination of scour and hydrodynamic loading; the collapse of the Northside road bridge in Workington caused one fatality and massive disruption. Whilst catastrophic bridge failures are rare, blockages of culverts and bridges, even over relatively small rivers, can cause flooding and erosion of safety-critical earthworks. Even minor incidents can lead to additional operational costs and risks for infrastructure operators, including those associated with debris clearance and emergency structural inspections. For the wider public, these incidents can cause disruption because of operational measures such as speed restrictions, delays, timetable changes or diversions. With nearly 10,000 bridges over watercourses on the rail network alone, the scale of the asset stock is significant. Despite industry efforts over the years, there remains much uncertainty about the individual resilience of these assets, and there is limited quantitative knowledge of this uncertainty. The uncertain and disparate nature of information about scour and blockage probabilities indicates that a formal elicitation of expert judgements will be useful in seeking to draw out a synthesis of current knowledge. Uncertainty inevitably has a major influence on a risk assessment and on any associated decisions in circumstances such as these; a structured procedure for eliciting expert judgements from a range of opinions is needed to obtain a rational consensus on the appropriate level of uncertainty quantification to use in the appraisal of contributory factors. Soliciting expert advice for decision support is not new, but it has generally been pursued on an informal basis, and an unstructured approach is rarely, if ever, entirely satisfactory to all parties. Neither is it likely to be immune to legitimate criticism or auditing from one side or another. To address these shortcomings, structured expert judgement ties the whole process to stated and transparent methodological rules, with the goal of treating expert judgements in the same way as 'normal' scientific data in a formal decision process. Various methods for assessing and combining expert uncertainty are described in the literature.
Until recently, the most familiar approach has been one that advocates a group decision-conferencing framework for eliciting opinions, but other approaches now exist for carrying out this process more objectively. Prominent amongst these is the expert weighting procedure known as the Classical Model, formulated by Cooke (1991). Drawing on the knowledge and expertise of UK and international experts, this project will use Cooke's Classical Model to quantify uncertainties in the vulnerability of bridges and culverts to blockage or scour - to support better guidance and risk models for infrastructure managers.
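To make the mechanics concrete, the sketch below implements the calibration score at the heart of Cooke's Classical Model: experts state 5%, 50% and 95% quantiles for 'seed' questions whose answers are known, and are scored on how statistically plausible those answers are under their stated quantiles. The experts, seed values and quantiles shown are invented for illustration; the full model also multiplies in an information score and applies a significance cut-off when forming weights.

```python
import numpy as np
from scipy.stats import chi2

P_BINS = np.array([0.05, 0.45, 0.45, 0.05])  # mass expected in each inter-quantile bin

def calibration_score(quantiles, realisations):
    """Cooke (1991) calibration: how statistically plausible it is that
    the known seed values came from the expert's stated 5/50/95% quantiles.

    quantiles    : (n_seeds, 3) array of an expert's 5%, 50%, 95% values
    realisations : (n_seeds,) true values of the seed variables
    """
    n = len(realisations)
    # Which inter-quantile bin each realisation falls in (0..3)
    bins = np.array([np.searchsorted(q, r) for q, r in zip(quantiles, realisations)])
    s = np.bincount(bins, minlength=4) / n            # empirical bin frequencies
    mask = s > 0
    rel_entropy = np.sum(s[mask] * np.log(s[mask] / P_BINS[mask]))
    return 1.0 - chi2.cdf(2 * n * rel_entropy, df=3)  # p-value style score

# Two hypothetical experts judged on five seed questions with known answers
truth = np.array([10.0, 3.0, 47.0, 0.8, 120.0])
expert_a = np.array([[8, 11, 14], [2, 3, 5], [40, 50, 60], [0.5, 0.9, 1.2], [90, 115, 150]])
expert_b = np.array([[9.9, 10, 10.1], [1, 2, 2.1], [46, 47, 48], [0.79, 0.8, 0.81], [119, 120, 121]])

for name, q in [("A (wider, better-calibrated intervals)", expert_a),
                ("B (very tight, overconfident intervals)", expert_b)]:
    print(name, f"calibration = {calibration_score(q, truth):.3f}")
```

In this illustration the overconfident expert is heavily penalised for a single badly missed seed question, which is exactly the behaviour that makes performance-based weighting more defensible than equal weighting or informal consensus.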