ZENODO
Dataset · 2024
Data sources: ZENODO

Machine Learning-based Human Life Detection Behind Walls Exploiting a UWB Radar Sensor

Authors: Dimitris Uzunidis; Panagiotis Kasnesis; Charalampos Z. Patrikakis; Stelios A. Mitilineos

Abstract

The "Signs of life detection behind obstacles using a UWB radar sensor" dataset was created by the University of West Attica, by collecting data from 9 people using a UWB sensor, which was the X4M200 UWB radar sensor by Novelda, with the existence of a wall between the subject and the radar for the purposes of RESCUER project (https://rescuerproject.eu/ - a Horizon 2020 Research & Innovation Programme under Grant Agreement No.101021836). Two datasets were collected. The first includes data recorded from standing humans positioned behind a wall of 30 cm thickness, and the second incorporates data gathered from lying down humans positioned behind a wall 20 cm thick and distant from the radar between 1 and 4 m. In both datasets, the BioHarness Zephyr 3 belt that estimates the breathing rate of the victims was exploited to provide the ground truth measurements of the breathing rate in each data collection session. Each session lasted about 90 seconds, and the belt provided the breathing rate every second. To obtain reliable measurements, we averaged the breathing rate after the first 60 seconds to acquire a single value for all samples of the same data collection session. Next, the human was moved 0.5 m away from its current position to gather data from the next session. This procedure was followed until the human reached a 4 m distance from the radar, allowing the acquisition of data from a total of 7 sessions (from 1 to 4 m). Additionally, we collected a number of data from the same places without any human presence to give the opportunity to train Machine Learning (ML) algorithms to detect human presence behind the wall. In particular, the number of sessions without human presence is 14 for the dataset including standing persons, and 10 for the dataset incorporating lying down persons. To facilitate the ML training using more samples, apart from the raw data, the dataset also contains overlapped data. We considered samples of 20-sec duration, exploiting an overlap of 15 sec over this 20-sec window. The labels for distance and breathing rate for the overlapped data are in the corresponding files. For the non-overlapped data, the file name contains the two labels, e.g. the file "clean 1_10.85.csv" denotes a distance of 1 m between the human and the radar and a breathing rate of the human equal to 10.85 breaths/min. Finally, the sampling rate of the radar was set to 17 samples per second and a distance step of about 0.05144 m was also considered. As a consequence the first row of each file denotes the different distances (over the range of 0.48 m to 6 m).
