Probabilistic seismic risk assessment involves the appraisal of three components: the seismic hazard due to expected ground motions, the exposure, i.e., the assets exposed to the hazard, and their vulnerability with respect to the hazard. Earthquake sequences are the result of a time-dependent process, which is complex to consider in a probabilistic framework. It is thus common to reduce sequences to their largest event, the so-called main shock. The occurrence of these main shocks can be modeled assuming a Poisson process with constant average rates. This is convenient from a mathematical point of view, as the associated probability distribution is simple, but it neglects a large portion of the seismicity. Because of the long inter-event times in such a Poissonian model, it is commonly assumed that potentially damaged buildings are repaired before the next strong event. In reality, the repair of buildings, especially if a large number have been affected, may take several years. As a consequence, buildings are likely to still be damaged when subsequent events occur. For aftershocks, which commonly occur within hours or days after the main event, this is definitely the case. Furthermore, a building’s response is not identical in its intact and damaged states. If these effects are not considered within a seismic risk assessment, the resulting estimates are likely to be inaccurate.
To determine whether this is the case over urban scales, this thesis sets out to design, develop and test an approach that relaxes these two assumptions, with the aim of developing time- and state-dependent seismic risk models. The accompanying research questions are: 1) whether this results in loss estimates that differ significantly from those of a classical model, 2) which components of the model have the strongest influence, and 3) whether simplifications can be made when modeling the complex process. The approach developed here employs a simulation framework that models full seismic sequences using an epidemic-type aftershock model, a Markov-chain damage process, state-dependent fragility models, and various probabilistic repair functions. The approach is tested for the city of Nablus in the West Bank, Palestine. For this purpose, a fully probabilistic seismic hazard model is developed for the region. A multi-source imaging analysis is employed to collect exposure data using remote rapid visual screening. For the identified building types, a simplified analysis is performed to obtain state-dependent fragility models. While the many simplifications applied do not yield reliable absolute losses, they permit the identification of tendencies and allow the formulated research questions to be answered.
The results show that a time- and state-dependent model yields significantly higher losses than a classical model, reaching a difference of up to 58% in the examples presented in this work. It is also shown that considering the problem only partially is not sufficient: including fore- and aftershocks without time-delayed repair and state dependency potentially leads to a strong overestimate of losses at low exceedance probabilities, whereas disregarding state dependency underestimates the losses over the whole range of probabilities. Furthermore, considering time and state dependency in Poissonian models does not influence the losses significantly, at least for the demonstrated case of Nablus.
It can be concluded that, despite the greater effort required to develop time- and state-dependent models over urban scales, the effects are strong enough to justify the development cost if accurate risk models are intended. However, for observation periods of a single year, it was found that a constant daily repair probability may be sufficient to approximate more complex time-variant repair models. Many additional questions have emerged from the findings of this thesis. These concern the confirmation and refinement of the identified tendencies through more detailed analyses using more sophisticated components, mainly for the hazard and fragility models. In addition, the framework developed within this thesis can form the basis for a multitude of new research directions in the field of cascading effects. These may include the consideration of multiple hazards, dynamically evolving exposure models, and interactions between them.
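To make the core loop of such a framework more tangible, the following is a minimal, hypothetical sketch of a Markov-chain damage process with state-dependent fragility curves and a constant daily repair probability, in the spirit of the approach described above. The damage states, lognormal fragility parameters, repair probability, and the toy event sequence are illustrative assumptions made here, not values from the thesis.

```python
# Hypothetical sketch of a state-dependent Markov damage process with a constant
# daily repair probability; all numbers are illustrative, not taken from the study.
import numpy as np
from scipy.stats import lognorm

STATES = ["intact", "slight", "moderate", "extensive", "collapse"]

# Assumed lognormal fragility parameters (median PGA in g, dispersion) giving the
# probability of reaching each *higher* state, conditional on the current state.
# Damaged buildings are assumed to be more fragile than intact ones.
FRAGILITY = {
    "intact":    [(0.15, 0.6), (0.30, 0.6), (0.60, 0.6), (1.00, 0.6)],
    "slight":    [(0.25, 0.6), (0.50, 0.6), (0.90, 0.6)],
    "moderate":  [(0.40, 0.6), (0.80, 0.6)],
    "extensive": [(0.70, 0.6)],
    "collapse":  [],
}

def sample_new_state(state, pga, rng):
    """One Markov transition of a single building subjected to ground motion `pga`."""
    probs = [lognorm.cdf(pga, s=beta, scale=med) for med, beta in FRAGILITY[state]]
    u, idx = rng.random(), STATES.index(state)
    for k in range(len(probs) - 1, -1, -1):   # check the highest reachable state first
        if u < probs[k]:
            return STATES[idx + k + 1]
    return state

def apply_daily_repair(state, p_repair, rng):
    """Constant daily probability of restoring a damaged (non-collapsed) building."""
    if state not in ("intact", "collapse") and rng.random() < p_repair:
        return "intact"
    return state

rng = np.random.default_rng(42)
state, prev_day = "intact", 0
# toy sequence of (day, PGA in g): a mainshock followed by aftershocks
for day, pga in [(0, 0.35), (1, 0.20), (3, 0.15), (30, 0.25)]:
    for _ in range(day - prev_day):           # repair may happen between events
        state = apply_daily_repair(state, p_repair=0.01, rng=rng)
    state = sample_new_state(state, pga, rng)
    print(f"day {day:3d}: pga = {pga:.2f} g -> {state}")
    prev_day = day
```

Because the building can still be in a damaged state when an aftershock arrives, the state-dependent fragility table drives losses higher than an intact-building model would, which is the effect quantified above.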
doi: 10.48380/460z-0n70
We explore the effects of a changing climate on groundwater dynamics based on thermo-hydraulic simulations to reconstruct the temperature and pressure below the State of Brandenburg between 1950 and 2010. In this time period, observations point to ~1°C surface temperature warming, large annual fluctuations in groundwater recharge, and periods of high groundwater abstraction volume — all leading to water stress conditions. Our input structural model integrates Permian to Cenozoic sedimentary units with essential geological features controlling the regional groundwater flow, including salt structures, permeable glacial valleys, and aquitard discontinuities. We use a grid-based hydrologic model to derive inflow and outflow rates across the top boundary of the subsurface model. Simulation outputs are verified against data from available observation wells. The simulation results demonstrate that the regional flow pattern in the deep aquifers (>1 km deep) is mainly controlled by the basin geometry, while shallow groundwater dynamics is heavily influenced by high-frequency climate forcing. Seasonal fluctuations in groundwater level are observed in areas of shallow (
These data are supplementary material to Ziegler & Heidbach (2020) and present the results of a 3D geomechanical-numerical model of the stress state with quantified uncertainties. The average modelled stress state is provided for each of the six components of the full stress tensor, together with the associated standard deviation for each component. The modelling approach uses a published lithological model, and the data used are described in the publication Ziegler & Heidbach (2020). The reduced stress tensor is derived using the Tecplot Addon GeoStress (Stromeyer & Heidbach, 2017). The model results are provided in a comma-separated ASCII file, in which each line represents one of the approx. 3 million finite elements that comprise the model.
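As a rough illustration of how such a per-element file could be consumed, the sketch below reads a comma-separated results file into a table. The file name and column labels are assumptions made here for illustration and should be replaced with the actual header documented by Ziegler & Heidbach (2020).

```python
# Hypothetical reader for the comma-separated model results described above.
# File name and column labels are assumptions; check the actual data description.
import pandas as pd

columns = (
    ["x", "y", "z"]                                          # element position (assumed)
    + ["Sxx", "Syy", "Szz", "Sxy", "Sxz", "Syz"]             # mean stress tensor components
    + ["sd_" + c for c in ["Sxx", "Syy", "Szz", "Sxy", "Sxz", "Syz"]]  # standard deviations
)

df = pd.read_csv("stress_model_results.csv", names=columns, comment="#")
print(len(df), "elements")                   # approx. 3 million lines expected
print(df[["Sxx", "sd_Sxx"]].describe())      # quick sanity check of one component
```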
The direction and strength of the geomagnetic field have evolved continuously in the past. One of the few means of obtaining continuous reconstructions of this evolution relies on sedimentary records. Such records are therefore important for understanding the geodynamo and the underlying evolution of the Earth’s interior, as well as providing an important dating tool through magnetostratigraphy. Sedimentary records of geomagnetic field variations rely on two main recording mechanisms: the alignment of magnetic particles, which underlies continuous records of relative paleointensity (RPI), and the archiving of cosmogenic isotopes, in particular ¹⁰Be, whose production by cosmic-ray spallation is modulated by the screening action of the dipole component of the Earth’s field. Previous studies reported similarities as well as significant differences between RPI and cosmogenic ¹⁰Be (expressed as ¹⁰Be/⁹Be) records. While a perfect match of the two records is not expected, owing to environmental contamination present in both, the similar changes during periods characterised by a significant decrease of the dipole moment are attributed to the global control of ¹⁰Be production by field strength and to the attenuation of non-dipolar features in RPI records measured in sediments with low sedimentation rates (<10 cm/ka). The aim of the present work was to improve our knowledge of the field recording mechanisms of marine sediments, in particular: (1) the environmental factors responsible for ¹⁰Be transport and removal from the water column, and the effect of source distributions on ⁹Be supply; (2) the effect of post-depositional processes, in particular sediment mixing, on ¹⁰Be and RPI records; (3) the mechanism by which a post-depositional magnetization is acquired near the bottom of the surface mixed layer; and (4) the causes of a systematic lag between ¹⁰Be and RPI records, and the environmental factors affecting RPI. In order to disentangle the environmental and magnetic contributions in sedimentary ¹⁰Be/⁹Be records, we analysed five records covering the last geomagnetic reversal. Different recording characteristics at the five sites have been described in terms of additive and multiplicative climatic modulations, which depend essentially on water depth, location along large oceanic current systems, and distance to the coast. Simple criteria have been derived for identifying the most suitable sites, i.e., those with minimal environmental contamination. A new bioturbation model has been developed to explain the natural remanent magnetization (NRM) of bioturbated sediment. This model includes a newly discovered phenomenon of size segregation in the surface mixed layer (SML), analogous to the well-known Brazil nut effect. Size segregation is responsible for the longer residence of larger particles in the SML, up to the limit case of ferromanganese nodules, and has important implications for sediment dating with benthic foraminifera. Calibration of the bioturbation model with microtektite profiles from two Indian Ocean cores made it possible to reproduce the observed delay between ¹⁰Be and RPI records, as well as the environmental dependence of RPI in two cores from the North Atlantic and Equatorial Pacific Oceans. The results obtained in this work can aid in developing integrated approaches for the correction of climatic contamination in ¹⁰Be and RPI records.
Furthermore, the predictive power of the bioturbation-based model for NRM acquisition can be used to design new laboratory experiments for the simulation of specific magnetic recording mechanisms. We have demonstrated the Brazil nut effect on microtektite particles, which consists in the size-dependent segregation of fragments. The size-segregation model also makes it possible to estimate the true depth of sediment mixing due to bioturbation, once the additional upward-migration velocity of the larger particles is taken into account. The results of this research are significant not only for characterising the sediment mixing response and reconstructing records affected by bioturbation (e.g. ¹⁰Be/⁹Be), but also for assessing the validity of age models that are constrained by the ages of conservative tracers.
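For intuition about how mixing can smooth and displace a signal recorded in sediment, the following is a minimal sketch of a classical exponential mixed-layer (lock-in type) convolution. It is explicitly not the size-segregation model developed in this work; the mixed-layer thickness, sedimentation rate, and step-like input are made-up values.

```python
# Minimal sketch: a classical exponential mixed-layer (lock-in type) convolution,
# NOT the size-segregation model of this work. All parameters are illustrative.
import numpy as np

mixing_depth_cm = 10.0                       # assumed mixed-layer thickness
sed_rate_cm_per_ka = 2.0                     # assumed sedimentation rate
dz = 0.5
depth = np.arange(0.0, 300.0, dz)

# input signal: step-like drop in dipole intensity (e.g. a reversal) at 150 cm
signal_in = np.where(depth < 150.0, 1.0, 0.3)

# exponential impulse response of the mixed layer, normalised to unit area
z_k = np.arange(0.0, 5 * mixing_depth_cm, dz)
kernel = np.exp(-z_k / mixing_depth_cm)
kernel /= kernel.sum()

# causal (downward) smearing with constant-value padding at the top of the core
padded = np.concatenate([np.full(kernel.size - 1, signal_in[0]), signal_in])
signal_rec = np.convolve(padded, kernel, mode="valid")   # same length as signal_in

# the recorded transition is smoothed and offset relative to the input, the kind
# of distortion and lag discussed for RPI and 10Be records above
mid = 0.5 * (signal_in.max() + signal_in.min())
lag_cm = depth[np.argmin(np.abs(signal_rec - mid))] - 150.0
print(f"apparent offset of the recorded transition: ~{lag_cm:.1f} cm "
      f"(~{lag_cm / sed_rate_cm_per_ka:.1f} ka at {sed_rate_cm_per_ka} cm/ka)")
```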
doi: 10.57757/iugg23-4706
In 2022, more snow fell across the Antarctic Ice Sheet (AIS) than in any previous year over at least the last four decades. As a result, the AIS surface mass balance (SMB), which accounts for both mass gains and losses at the ice sheet surface, was also at 40+ year highs and likely led to a net negative annual contribution of the ice sheet to global sea level. Here, we assess these SMB anomalies, calculated as total precipitation minus evaporation and sublimation, as well as their drivers using the ERA5 and MERRA2 global reanalyses. Both products indicate positive SMB anomalies of >300 Gt/yr in 2022 compared to the 1991-2020 climatological mean SMB, equating to 3.1 and 2.5 standard deviations above the respective long-term means. Annual anomalies were driven by positive monthly anomalies exceeding interannual variability in January, March, July, September, and November. In this presentation, we examine the spatial and temporal character of SMB variations and their linkages to concurrent anomalies in the atmosphere and Southern Ocean. We highlight the significant impacts of landfalling atmospheric rivers in specific sectors and months and assess the potential role of Southern Ocean surface forcing, which saw record low sea ice coverage for much of 2022. This work seeks to elucidate the drivers of SMB anomalies in 2022, which holds potential implications for understanding future AIS mass variations in a warming climate.
The 28th IUGG General Assembly (IUGG2023) (Berlin 2023)
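A toy sketch of the anomaly bookkeeping described above (SMB as precipitation minus evaporation and sublimation, and the 2022 anomaly expressed in standard deviations of the 1991-2020 climatology). All numbers below are made up; the actual totals come from ERA5 and MERRA2.

```python
# Toy sketch of the SMB anomaly calculation described above; all numbers are made up.
import numpy as np

def smb(precip, evap, subl):
    """Surface mass balance = precipitation - evaporation - sublimation (Gt/yr)."""
    return precip - evap - subl

rng = np.random.default_rng(0)
annual_smb = rng.normal(loc=2300.0, scale=110.0, size=30)   # synthetic 1991-2020 series
clim_mean, clim_std = annual_smb.mean(), annual_smb.std(ddof=1)

smb_2022 = smb(precip=3000.0, evap=210.0, subl=160.0)        # hypothetical 2022 totals
anomaly_gt = smb_2022 - clim_mean                            # ">300 Gt/yr" in the abstract
anomaly_sigma = anomaly_gt / clim_std                        # 3.1 (ERA5) / 2.5 (MERRA2) reported
print(f"2022 SMB anomaly: {anomaly_gt:+.0f} Gt/yr ({anomaly_sigma:+.1f} sigma)")
```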
Understanding physical processes prior to and during eruptions remains challenging, due to uncertainties about subsurface structures and undetected processes within the volcano. Here, the authors use a dedicated fibre-optic cable to obtain strain data, identify volcanic events, and image hidden near-surface volcanic structural features at Etna volcano, Italy. In the paper Jousset et al. (2022), we detect and characterize strain signals associated with explosions, and we find evidence for non-linear grain interactions in a scoria layer of spatially variable thickness. We also demonstrate that wavefield separation allows us to incrementally investigate the ground response to various excitation mechanisms, and we identify very small volcanic events, which we relate to fluid migration and degassing. We recorded seismic signals from natural and man-made sources with 2-m spacing along a 1.5-km-long fibre-optic cable layout near the summit of the active craters of Etna volcano, Italy. These results provide the basis for improved volcano monitoring and hazard assessment using DAS. This data publication contains the full data set used for the analysis. The data set comprises strain-rate data from 1 iDAS interrogator (~750 traces), velocity data from 15 geophones and 4 broadband seismometers, and infrasonic pressure data from infrasound sensors. For further explanation of the data and related processing steps, please refer to Jousset et al. (2022).
GFZ acts as a global analysis center of the International GNSS Service (IGS) and provides ultra-rapid (last 24 h), rapid (last day), and final (last week) solutions for GPS and GLONASS. The ultra-rapid solution series is published eight times per day with a delay of around three hours.
The 3D seismic velocity models are the results of a local earthquake tomography performed to illuminate the crustal and uppermost mantle structure beneath the southern Puna plateau and to test the delamination hypothesis. The Southern Puna is distinct from the rest of the Central Andean plateau in having a higher topographic elevation and a thinner lithosphere, and in being flanked to the south by the Chilean flat-slab region. Previous investigations involving geochemical, geological and geophysical observations have invoked lithospheric delamination to explain the distinctive magmatic and structural history, elevation and lithospheric thickness of the region. In the present study, Vp and Vp/Vs ratios were obtained using travel-time variations recorded by 75 temporary seismic stations between 2007 and 2009. The earthquake catalog (Mulcahy et al., 2014) contains 1903 local earthquakes (25077 P- and 14059 S-picks). A minimum 1D model is derived with the software VELEST (Kissling et al., 1995). The 3D tomographic inversion is performed with the software SIMULPS (Thurber, 1983; Evans et al., 1994). Spread values, calculated from the model resolution matrix (Toomey & Foulger, 1989), are used to define well-resolved model domains (6 for Vp and 5.5 for Vp/Vs). The data are provided as one tar.gz archive. Individual ASCII files contain, at each depth from 0 to 200 km:
- Vp model (model.vp.depth_???km), format: longitude, latitude, depth, Vp perturbation, absolute Vp
- Vp/Vs model (model.vpvs.depth_???km), format: longitude, latitude, depth, Vp/Vs perturbation, absolute Vp/Vs
- spread values for Vp (spread.vp.depth_???km), format: longitude, latitude, depth, spread value
- spread values for Vp/Vs (spread.vpvs.depth_???km), format: longitude, latitude, depth, spread value
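A hedged sketch of loading one of the per-depth layers listed above, assuming whitespace-separated columns in the documented order. The depth "050" in the file names is only an illustrative value; the delimiter and exact file names should be checked against the archive contents.

```python
# Hypothetical loader for one depth slice of the Vp model described above.
# Assumes whitespace-separated columns in the documented order; the depth "050"
# in the file names is only an example.
import numpy as np

def load_layer(path):
    lon, lat, depth, perturbation, absolute = np.loadtxt(path, unpack=True)
    return lon, lat, depth, perturbation, absolute

lon, lat, depth, dvp, vp = load_layer("model.vp.depth_050km")
_, _, _, spread = np.loadtxt("spread.vp.depth_050km", unpack=True)
resolved = spread <= 6.0                     # spread threshold for Vp quoted above
print(f"{resolved.sum()} well-resolved nodes, mean Vp = {vp[resolved].mean():.2f} km/s")
```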
doi: 10.15488/14308
Over the past few decades, the occurrence and intensity of geological hazards, such as landslides, have risen substantially due to various factors, including global climate change, seismic events, rapid urbanization and other anthropogenic activities. Landslide disasters pose a significant risk in both urban and rural areas, resulting in fatalities, infrastructure damage, and economic losses. Nevertheless, conventional ground-based monitoring techniques are often costly, time-consuming, and require considerable resources. Moreover, some landslide incidents occur in remote or hazardous locations, making ground-based observation and field investigation challenging or even impossible. Fortunately, advances in spaceborne remote sensing technology have led to the availability of large-scale and high-quality imagery, which can be utilized for various landslide-related applications, including identification, monitoring, analysis, and prediction. This efficient and cost-effective technology allows for remote monitoring and assessment of landslide risks and can significantly contribute to disaster management and mitigation efforts. Consequently, spaceborne remote sensing techniques have become vital for geohazard management in many countries, benefiting society by providing reliable downstream services. However, substantial effort is required to ensure that such benefits are provided. For establishing long-term data archives and reliable analyses, it is essential to maintain consistent and continued use of multi-sensor spaceborne remote sensing techniques. This will enable a more thorough understanding of the physical mechanisms responsible for slope instabilities, leading to better decision-making and the development of effective mitigation strategies. Ultimately, this can reduce the impact of landslide hazards on the general public. The present dissertation contributes to this effort from the following perspectives:
1. To obtain a comprehensive understanding of spaceborne remote sensing techniques for landslide monitoring, we integrated multi-sensor methods to monitor the entire life cycle of landslide dynamics. We aimed to comprehend landslide evolution under complex cascading events by utilizing various spaceborne remote sensing techniques, e.g., for the precursory deformation before catastrophic failure, co-failure processes, and post-failure evolution of slope instability.
2. To address the discrepancies between spaceborne optical and radar imagery, we present a methodology that models four-dimensional (4D) post-failure landslide kinematics using a decaying mathematical model. This approach enables us to represent the stress relaxation of the landslide body after failure. By employing this methodology, we can overcome the weaknesses of the individual sensors in spaceborne optical and radar imaging.
3. We assessed the effectiveness of a newly designed small dihedral corner reflector for landslide monitoring. The reflector is compatible with both ascending and descending satellite orbits and is suitable for applications with both high-resolution and medium-resolution satellite imagery. Furthermore, although its echoes are not as strong as those of conventional reflectors, the cost of the newly designed reflectors is reduced, with more manageable installation and maintenance. To overcome this limitation, we propose a specific selection strategy based on a probability model to identify the reflectors in satellite images.
Response Spectrum Analysis (RSA), one of the most popular methods to carry out the seismic design of multi-degree-of-freedom (MDOF) structures, is based on the concept of modal superposition, by which the uncoupled equations of motion that represent each mode of vibration of the system can be solved independently and the resulting responses superimposed by assuming linear elastic behaviour. Each mode is represented by a single-degree-of-freedom (SDOF) system, whose peak response is retrieved from response spectra deemed suitable for design. However, while modal superposition allows the total response of a MDOF system to be determined by simple addition of the individual modal responses at each time step, the combination of spectral values needs to take into account the fact that peak modal responses do not necessarily occur at the same time or along the same horizontal directions. These considerations give rise to the use of modal and spatial combination rules that aim to calculate the likely peak response of a MDOF system instead of conservatively carrying out an algebraic sum of maxima. Current design codes prescribe methodologies that were defined in the 1970s and 1980s, such as the Complete Quadratic Combination (CQC) [1], its three-dimensional extension CQC3 [2], the Square Root of the Sum of the Squares (SRSS) [3], or the 30% rules [4], based mostly on random vibration theory. However, access to large numbers of ground motion records at the present time allows us to revisit these approaches from a data-driven perspective and investigate the relationship across the peaks of SDOF responses to seismic excitation at different orientations and at different points in time, with the ultimate goal of characterising this relationship in a fully probabilistic way. This paper presents the results of a study of SDOF demands obtained considering 1,218 accelerograms from the RESORCE database [5], whose two horizontal perpendicular components were rotated around all non-redundant angles every 2° and applied to SDOF systems with periods of vibration of 0.2, 1.0 and 3.0 seconds, and sets of secondary systems with periods ranging from 0.5 through 0.95 times the three aforementioned periods. The concept of peak response was extended to include all peaks with amplitudes above two alternative thresholds of 80% and 95% of the maximum absolute response. Two main kinds of parameters were studied and are presented: (i) time differences between peaks of the same component and across perpendicular components, and (ii) ratios of instantaneous displacement demands between perpendicular components and the same component for different oscillator periods, as one of the components reaches a peak in the oscillator’s response. While results for the latter resemble the idea of the 0.3 coefficient from the 30% rule in average terms, the dispersion associated with all these parameters is large and should not be neglected.
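As a reference point for the combination rules named above, here is a short sketch of the SRSS and equal-damping CQC modal combinations and the 30% spatial rule, with made-up modal peaks and periods. It only restates the textbook rules; the rotation and peak-threshold analysis of the paper itself is not reproduced.

```python
# Illustrative implementations of standard combination rules (SRSS, CQC with equal
# modal damping, and the 30% spatial rule). Input values are made up.
import numpy as np

def srss(peaks):
    """Square Root of the Sum of the Squares over modal peak responses."""
    peaks = np.asarray(peaks, dtype=float)
    return np.sqrt(np.sum(peaks ** 2))

def cqc(peaks, omegas, zeta=0.05):
    """Complete Quadratic Combination with the equal-damping correlation coefficient."""
    peaks = np.asarray(peaks, dtype=float)
    omegas = np.asarray(omegas, dtype=float)
    r = omegas[None, :] / omegas[:, None]          # frequency ratios omega_j / omega_i
    rho = (8 * zeta**2 * (1 + r) * r**1.5) / ((1 - r**2) ** 2 + 4 * zeta**2 * r * (1 + r) ** 2)
    return np.sqrt(peaks @ rho @ peaks)

def rule_30_percent(rx, ry):
    """30% rule for combining peak responses from two horizontal directions."""
    return max(abs(rx) + 0.3 * abs(ry), 0.3 * abs(rx) + abs(ry))

modal_peaks = [12.0, 5.0, 2.0]                      # e.g. storey displacements in mm
omegas = 2 * np.pi / np.array([1.0, 0.35, 0.2])     # circular frequencies for T = 1.0, 0.35, 0.2 s
print("SRSS :", srss(modal_peaks))
print("CQC  :", cqc(modal_peaks, omegas))
print("30%  :", rule_30_percent(10.0, 6.0))
```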
The GFZ-Landsvirkjun Theistareykir Fibre array is located in the Theistareykir geothermal area in North Iceland. It is collocated with arrays of broadband seismometers and gravity meters (see e.g., https://doi.org/10.1186/s40517-021-00208-w). The geometry of the fibre array follows the telecom network in the area and was chosen to test the seismological capabilities of telecom cables in this geothermal environment. We connected an iDAS V2 interrogator from Silixa. The interrogator location is lat=65.898041, lon=-16.966274. The array starts N-S and, after 1.5 km, turns towards the East, up to a local transmission antenna station for mobile phones. The length of the path is ~5 km. The length of the cable is actually more than 15 km, as another fibre instance is connected at the transmission antenna station. Jumps were performed along the cable to geo-locate the channels. The exact location of the fibre can unfortunately not be disclosed. Original recordings at 1000 Hz were downsampled to 200 Hz using software from INGV-OE (michele.prestifilippo@ingv.it) and are provided in h5 format. We provide here the first fibre instance (5 km long). The data contain 1-h-long recording intervals framing M>5 teleseismic earthquakes recorded in the frame of the global DAS month, an initiative to collaboratively record and share simultaneously recorded DAS data from all over the world (https://www.norsar.no/in-focus/global-das-monitoring-month-february-2023). DAS is an emerging technology increasingly used by seismologists to convert kilometer-long optical fibers into seismic sensors.
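A hedged sketch of the kind of 5:1 decimation that brings 1000 Hz recordings down to 200 Hz, applied to a DAS record stored in an HDF5 file. The file name, dataset key, and (time, channel) layout below are assumptions, since the internal structure of the published h5 files is not specified here, and the published data are already provided at 200 Hz.

```python
# Hypothetical sketch of 1000 Hz -> 200 Hz downsampling of a DAS record stored in
# an HDF5 file. File name, dataset key and (time, channel) layout are assumptions.
import h5py
from scipy.signal import decimate

with h5py.File("theistareykir_das_example.h5", "r") as f:
    strain_rate = f["data"][:]                # assumed key, shape (n_samples, n_channels)

fs_in, factor = 1000, 5                       # 1000 Hz / 5 = 200 Hz
downsampled = decimate(strain_rate, factor, axis=0, ftype="fir", zero_phase=True)
print(downsampled.shape, "->", fs_in // factor, "Hz")
```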