84 Research products, page 1 of 9

  • Open Access
    Authors: 
    Aldo Piombino; Filippo Bernardini; Gregorio Farolfi;
    Publisher: MDPI AG

    Recently, a new strain rate map of Italy and the surrounding areas has been obtained by processing data acquired by the persistent scatterers (PS) of the synthetic aperture radar interferometry (InSAR) satellites ERS and ENVISAT between 1990 and 2012. This map clearly shows a link between strain rate and the shallow earthquakes (less than 15 km deep) that have occurred from 1990 to today, whose epicenters lie only in high strain rate areas (e.g., the Emilia plain, NW Tuscany, the Central Apennines). However, the map also presents various regions with high strain rates in which no damaging earthquakes have occurred since 1990. One of these is the Apennine sector formed by Sannio and Irpinia. This area represents one of the most important seismic districts, with a well-known and recorded seismicity from Roman times up to the present day. In our study, we merged historical records with new satellite techniques that allow for the precise determination of ground movements, and then derived physical quantities such as strain rate. In this way, we verified that the occurrence of new strong shocks in Irpinia is still a realistic possibility, forty years after one of the strongest known seismic events in the district, which occurred on 23 November 1980 and measured Mw 6.8. The reason is that, since 1990, only areas characterized by high strain rates have hosted significant earthquakes. This picture has also been confirmed by analyzing the historical catalog of events, which is seismically complete for magnitude M ≥ 6 over the last four centuries: strong seismic events with magnitude M ≥ 6 generally occurred within relatively short intervals of one another, with the exception of a 200-year period without strong earthquakes between 1732 and 1930.
    This aspect is very important from various points of view, particularly for civil protection planning, as well as for civil engineering and urban development.

  • Publication . Preprint . Article . 2019
    Open Access

    The 24 September 2001 College Park, Maryland, tornado was a long-track and strong tornado that passed within close range of two Doppler radars. It was the third in a series of three tornadoes associated with a supercell storm that developed in Stafford County, Virginia, initiated 3–4 km southwest of College Park, and dissipated near Columbia, Howard County. The supercell tracked approximately 120 km and lasted for about 126 min. This study presents a synoptic and mesoscale overview of the favorable conditions and forcing mechanisms that resulted in the severe convective outbreak associated with the College Park tornado. The results show many critical elements of the tornadic event, including a negatively tilted upper-level trough over the Ohio Valley, a jet stream with moderate vertical shear, a low-level warm, moist tongue of air associated with strong southerly flow over south-central Maryland and Virginia, and significantly increased convective available potential energy (CAPE) during the late afternoon hours. A possible role of urban heat island effects from Washington, DC, in increasing CAPE for the development of the supercell is discussed. Satellite imagery reveals the banded convective morphology with high cloud tops associated with the supercell that produced the College Park tornado. Operational WSR-88D data exhibit a high-reflectivity "debris ball" or tornadic debris signature (TDS) within the hook echo, the evolution of the parent storm from a supercell structure to a bow echo, and a tornado cyclone signature (TCS). Many of the mesoscale features could be captured by contemporary numerical model analyses. This study concludes with a discussion of the effectiveness of the coordinated use of satellite and radar observations in the operational environment of nowcasting severe convection.

  • Open Access English
    Authors: 
    Elisabeth Maidl; Matthias Buchecker;
    Publisher: MDPI AG

    The term "risk" is often used synonymously with "danger". Whilst the technical definition is accurately defined, in practice the term is connoted with divergent meanings in natural hazard risk research and the practice of risk management. Considering this divergence as a deficiency, risk communication often aims to correct laypersons' understanding. We suggest instead treating the variety of meanings as a resource for risk communication strategies. However, there is to date no investigation of what laypersons' meanings of risk actually comprise. To address this gap, we examine the meanings of risk by applying a social representations (SR) approach within a qualitative case study research design. Results of the study among inhabitants of Swiss mountain villages show that meanings differ according to hazard experience and community size. We found commonly shared core representations as well as peripheral ones. We conclude with suggestions on how to make use of this knowledge of SR in risk communication.

  • Open Access English
    Authors: 
    Matteo Gentilucci; Maurizio Barbieri; Peter Burt; Fabrizio D'Aprile;
    Publisher: MDPI
    Countries: United Kingdom, Italy

    This study provides a unique procedure for validating and reconstructing temperature and precipitation data. Although developed from data in Middle Italy, the validation method is intended to be universal, subject to appropriate calibration for the climate zones analysed. This research is an attempt to create shared applicative procedures that are often only theorized, or embedded in software without a clear definition of the methods. The purpose is to detect most types of errors according to the procedures for data validation prescribed by the World Meteorological Organization, defining practical operations for each of the five types of data controls: gross error checking, internal consistency check, tolerance test, temporal consistency, and spatial consistency. Temperature and precipitation data over the period 1931–2014 were investigated. This process led to the removal of 375 records (0.02%) of temperature data from 40 weather stations and 1286 records (1.67%) of precipitation data from 118 weather stations, and to the reconstruction of 171 data points. In conclusion, this work contributes to the development of standardized methodologies to validate climate data and provides an innovative procedure to reconstruct missing data in the absence of reliable reference time series.
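    Two of the five controls listed above, gross error checking and the tolerance test, can be sketched in a few lines. This is an illustrative sketch only: the plausibility limits and the k = 3 tolerance factor are assumed values for demonstration, not the calibration used in the study.

```python
# Illustrative sketch of two WMO-style data controls: gross error checking
# (physically plausible limits) and a tolerance test (deviation from the
# station mean). Thresholds are assumed values, not the study's calibration.
import statistics

def gross_error_check(temps_c, lower=-45.0, upper=50.0):
    """Return records outside physically plausible temperature limits."""
    return [t for t in temps_c if t < lower or t > upper]

def tolerance_test(temps_c, k=3.0):
    """Return records more than k standard deviations from the station mean."""
    mean = statistics.fmean(temps_c)
    sd = statistics.stdev(temps_c)
    return [t for t in temps_c if abs(t - mean) > k * sd]

series = [12.1, 11.8, 13.0, 12.4, 99.9, 12.2, 11.5, 12.7, 12.0, 12.3]
print(gross_error_check(series))  # → [99.9]
```

    In practice these per-station checks would be followed by the temporal and spatial consistency controls, which compare each record against its neighbours in time and against nearby stations.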

  • Open Access English
    Authors: 
    Daniel D. Marshall; Carol-Anne Nicol; Robert Greene; Rick Sawyer; Armond Stansell; Ross Easterbrook;
    Country: Canada
    Project: NSERC

    Gold, present as electrum, in the Battle Gap, Ridge North-West, HW, and Price deposits at the Myra Falls mine occurs in late veinlets cutting the earlier volcanogenic massive sulphide (VMS) lithologies. The ore mineral assemblage containing the electrum comprises dominantly galena, tennantite, bornite, sphalerite, chalcopyrite, pyrite, and rarely stromeyerite, and is defined as an Au-Zn-Pb-As-Sb association. The gangue comprises barite, quartz, and minor feldspathic volcanogenic sedimentary rocks and clay, composed predominantly of kaolinite with subordinate illite. The deposition of gold as electrum in the baritic upper portions of the sulphide lenses occurred at relatively shallow water depths beneath the sea floor. Primary, pseudosecondary, and secondary fluid inclusions, petrographically related to gold, show boiling fluid inclusion assemblages in the range of 123 to 173 °C, with compositions and eutectic melt temperatures consistent with seawater at approximately 3.2 wt % NaCl equivalent. The fluid inclusion homogenization temperatures are consistent with boiling seawater at water depths ranging from 15 to 125 m. Slightly more dilute brines, with salinities of approximately 1 wt % NaCl, indicate input from very low-salinity brines, which could represent a transition from subaqueous VMS to epithermal-like conditions for precious metal enrichment, mixing with re-condensed vapor, or very low-salinity igneous fluids.

  • Open Access English
    Authors: 
    Evelina Volpe; Luca Ciabatta; Diana Salciarini; Stefania Camici; Elisabetta Cattoni; Luca Brocca;
    Publisher: Preprints
    Country: Italy

    The development of forecasting models for the evaluation of potential slope instability after rainfall events is an important issue for the scientific community. The topic has received considerable impetus due to the effect of climate change on territories, as several studies demonstrate that global warming can significantly influence landslide activity and the stability conditions of natural and artificial slopes. A consolidated approach to evaluating rainfall-induced landslide hazard is based on the integration of rainfall forecasts and physically based (PB) predictive models through deterministic laws. However, considering the complex nature of the processes and the high variability of the random quantities involved, probabilistic approaches are recommended in order to obtain reliable predictions. A crucial aspect of the stochastic approach is the definition of appropriate probability density functions (pdfs) to model the uncertainty of the input variables, as this may have an important effect on the evaluation of the probability of failure (PoF). The role of the pdf definition in reliability analysis is discussed through a comparison of PoF maps generated using Monte Carlo (MC) simulations performed over a study area in the Umbria region of central Italy. The study revealed that the use of uniform pdfs for the random input variables, often adopted when a detailed geotechnical characterization of the soil is not available, could be inappropriate.
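    The sensitivity of the PoF to the pdf choice can be illustrated with a minimal Monte Carlo sketch. Everything here is an assumption for demonstration, not the study's model or parameters: a simplified infinite-slope factor of safety for a dry, cohesionless soil, FS = tan(φ)/tan(β), a 32° slope angle, and a uniform versus a normal pdf for the friction angle with the same mean.

```python
# Minimal Monte Carlo sketch of a probability-of-failure (PoF) estimate.
# Assumed model: infinite-slope factor of safety for a dry, cohesionless
# soil, FS = tan(phi) / tan(beta); failure when FS < 1, i.e. phi < beta.
import math
import random

random.seed(42)

def pof(sample_phi_deg, beta_deg=32.0, n=100_000):
    """Estimate P(FS < 1) by sampling the soil friction angle phi."""
    tan_beta = math.tan(math.radians(beta_deg))
    failures = sum(
        1 for _ in range(n)
        if math.tan(math.radians(sample_phi_deg())) / tan_beta < 1.0
    )
    return failures / n

# Same mean friction angle (35 deg), different pdfs -> different PoF
uniform_pof = pof(lambda: random.uniform(25.0, 45.0))  # uniform pdf
normal_pof = pof(lambda: random.gauss(35.0, 3.0))      # normal pdf
print(uniform_pof, normal_pof)
```

    With identical mean friction angles, the uniform pdf yields roughly twice the PoF of the normal pdf in this toy setting, which is the kind of sensitivity the abstract warns about when uniform pdfs are adopted by default.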

  • Open Access
    Authors: 
    Mariya Shumskayte; Andrey A. Mezin; Elena Chernova; Aleksandra Burukhina; Nikita A. Golikov; Svetlana Melkozerova;
    Publisher: MDPI AG

    This article deals with the topical problem of estimating water content in water–oil mixtures, and in the porous media they saturate, using low-field NMR relaxometry and dielectric spectroscopy. The aim of the research is experimental validation of the capability of complex data interpretation to acquire information on the filtration-volumetric properties of drill cuttings, the relaxation characteristics of oil-containing fluids, the water/oil ratio in water–oil mixtures, and the saturation of drill cuttings, in order to control the composition of liquids produced from boreholes. The studies were carried out on samples of cuttings and oils taken from fields in the northern regions of the West Siberian oil-and-gas province, where NMR studies had not been performed before. Based on the experimental data obtained, the possibility of water content assessment in water-in-oil mixtures and the porous media they saturate was demonstrated through NMR relaxometry. With the proposed methodology, the amount of water in oil–water mixtures was established, and their main NMR characteristics were determined. The relative error in evaluating the proportion of water in mixtures based on high-viscosity oils is less than 10%, and about 20% for those based on light oils. When determining the oil–water ratio in the pore space of the drill cuttings, the error is about 15%. Joint use of the two techniques was shown to increase the reliability of the oil–water ratio assessment for all the samples studied. Furthermore, the NMR spectrum shifts to the right, and the complex permittivity spectrum shifts downwards, in the transition from high-viscosity oils to light ones.

  • Open Access English
    Authors: 
    Rian A. Engle; Lance D. Yarbrough; Greg Easson;
    Publisher: Preprints

    The Upper Jurassic (Oxfordian Age) Smackover Formation is a significant source for hydrocarbon production in southwest Alabama. Brooklyn Field is in southeast Conecuh County, Alabama, and has been a major producer of oil and natural gas for the state. The Smackover is a carbonate formation that has been divided into seven distinct lithofacies in the Brooklyn and Little Cedar Creek fields. In southwest Alabama, the facies distribution in the Smackover Formation was influenced by the paleotopography of the underlying Paleozoic rocks of the Appalachian system. The goal of this study is to determine elemental ratios in rock core within the Smackover Formation using a handheld X-ray fluorescence (XRF) scanner and to correlate these elemental characteristics to the lithofacies of the Smackover Formation in the Brooklyn and Little Cedar Creek fields. Eight wells within the Brooklyn and Little Cedar Creek fields were used for the study, and cores from the eight wells were scanned at six-inch intervals. Chemical logs were produced to show elemental weights in relation to depth and lithofacies. The chemical signatures within producing zones were correlated to reservoir lithofacies and porosity. Aluminum, silicon, calcium, titanium, and iron were the most significant (>95% confidence level) predictors of porosity and may be related to the depositional environment and subsequent diagenesis of the producing facies. The XRF data suggest relative enrichments in iron, titanium, and potassium. These elements may be related to deposition in relatively restricted marine waters.
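    The kind of elemental-weight-versus-porosity correlation described above can be illustrated with a single-predictor least-squares fit. The calcium and porosity values below are invented for demonstration; the study's reported predictors and confidence levels come from its own core data and a multi-element analysis.

```python
# Illustrative least-squares fit of porosity against one XRF elemental
# weight. The calcium and porosity values are invented for demonstration.
def ols(x, y):
    """Slope and intercept of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    return a, my - a * mx

ca_wt = [30.0, 32.5, 35.0, 37.5, 40.0]    # hypothetical Ca weight-%
porosity = [18.0, 15.5, 13.2, 10.8, 8.1]  # hypothetical porosity-%
slope, intercept = ols(ca_wt, porosity)
print(f"porosity ~ {slope:.2f} * Ca + {intercept:.2f}")
```

    A real chemostratigraphic workflow would regress porosity on several elements at once and test each coefficient's significance, which is how a ">95% confidence level" statement for a predictor is obtained.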

  • Open Access
    Authors: 
    Mohamed Mejri; Maiza Bekara;
    Publisher: MDPI AG

    Seismic imaging is the main technology used for subsurface hydrocarbon prospection. It provides an image of the subsurface using the same principles as ultrasound medical imaging. As with any data acquired through hydrophones (pressure sensors) and/or geophones (velocity/acceleration sensors), raw seismic data are heavily contaminated with noise and unwanted reflections that need to be removed before further processing. Noise attenuation is therefore done at an early stage, often while the data are being acquired. Quality control (QC) is mandatory to give confidence in the denoising process and to ensure that a costly data re-acquisition is not needed. QC is done manually by humans and comprises a major portion of the cost of a typical seismic processing project. It is therefore advantageous to automate this process to improve cost and efficiency. Here, we propose a supervised learning approach to build an automatic QC system. The QC system is an attribute-based classifier trained to classify three types of filtering (mild = under-filtering, noise remains in the data; optimal = good filtering; harsh = over-filtering, the signal is distorted). The attributes are computed from the data and represent geophysical and statistical measures of the quality of the filtering. The system is tested on a full-scale survey (9000 km²) to QC the results of the swell noise attenuation process in marine seismic data.
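    The attribute-based three-class idea can be sketched with a toy nearest-centroid classifier standing in for the supervised model. The two attributes (a remaining-noise ratio and a signal-distortion ratio) and all training values are invented for illustration; the paper's actual attributes are geophysical and statistical measures computed from the seismic data, and its classifier is trained on labelled filtering results.

```python
# Toy sketch of an attribute-based three-class QC classifier
# (mild / optimal / harsh filtering), using nearest centroids.
# Attributes and training values are invented for illustration.
import math

# attribute vector: (remaining-noise ratio, signal-distortion ratio)
TRAINING = {
    "mild":    [(0.40, 0.02), (0.35, 0.03), (0.45, 0.01)],
    "optimal": [(0.10, 0.05), (0.08, 0.06), (0.12, 0.04)],
    "harsh":   [(0.02, 0.30), (0.03, 0.25), (0.01, 0.35)],
}

# mean attribute vector per class
CENTROIDS = {
    label: tuple(sum(v) / len(pts) for v in zip(*pts))
    for label, pts in TRAINING.items()
}

def classify(attrs):
    """Return the label whose centroid is nearest to the attribute vector."""
    return min(CENTROIDS, key=lambda lbl: math.dist(attrs, CENTROIDS[lbl]))

print(classify((0.09, 0.05)))  # → optimal
```

    In the paper's setting, each filtered data window would be reduced to such an attribute vector and classified, so that only windows flagged mild or harsh need human review.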

  • Open Access English
    Authors: 
    Tumel; Zotova;
    Publisher: MDPI AG

    The diagnosis of the geoecological state of natural landscapes during economic development of the permafrost zone should be based on an assessment of destructive cryogenic processes. Furthermore, the geoecological state should be considered in terms of landscape resistance to the intensification of cryogenic processes. In this paper, we examine and determine lithocryogenic stability parameters, including permafrost distribution over an area, annual mean temperature, ice content (humidity), and the protective properties of the vegetation. The activation of cryogenic processes in Western Siberia is estimated in terms of area, rate of development and attenuation, damage to natural landscapes, and hazards to the operation of engineering and mining facilities. The evaluation procedure and the refinement of expert numerical scores are shown. A number of approved methods are proposed for creating assessment maps at various scales, using landscape indication methods, interpreted satellite images, expert assessments, statistical calculations, and spatial analysis in geographic information systems. Methodical techniques for landscape-based digital geocryological mapping are presented at scales from 1:3,000,000 to 1:20,000,000. All the maps were created by the authors and can be used for a wide range of applications, including design and survey organizations and education.
