ZENODO
Dataset, 2019
License: CC BY
Data sources: ZENODO; Datacite

Wrist-mounted IMU data towards the investigation of free-living human eating behavior - the Free-living Food Intake Cycle (FreeFIC) dataset

Authors: Kyritsis, Konstantinos; Diou, Christos; Delopoulos, Anastasios

Abstract

Introduction

The Free-living Food Intake Cycle (FreeFIC) dataset was created by the Multimedia Understanding Group towards the investigation of in-the-wild eating behavior. This is achieved by recording the subjects' meals as a small part of their everyday, unscripted activities. The FreeFIC dataset contains the \(3D\) acceleration and angular velocity signals (\(6\) DoF) from \(22\) in-the-wild sessions provided by \(12\) unique subjects. All sessions were recorded using a commercial smartwatch (\(6\) using the Huawei Watch 2™ and the rest using the MobVoi TicWatch™) while the participants performed their everyday activities. In addition, FreeFIC contains the start and end moments of each meal, as reported by the participants.

Description

FreeFIC includes \(22\) in-the-wild sessions that belong to \(12\) unique subjects. Participants were instructed to wear the smartwatch on the hand of their preference well before any meal and to continue wearing it throughout the day until the battery was depleted. In addition, we followed a self-report labeling model: the ground truth is provided by the participants themselves, who documented the start and end moments of their meals to the best of their ability, as well as the hand on which they wore the smartwatch. The total duration of the \(22\) recordings sums up to \(112.71\) hours, with a mean duration of \(5.12\) hours. Additional dataset statistics can be obtained by executing the provided Python script stats_dataset.py. Furthermore, the accompanying Python script viz_dataset.py visualizes the IMU signals and the ground-truth intervals for each recording. Information on how to execute the Python scripts can be found below.

# The script(s) and the pickle file must be located in the same directory.
# Tested with Python 3.6.4
# Requirements: Numpy, Pickle and Matplotlib

# Calculate and echo dataset statistics
$ python stats_dataset.py

# Visualize signals and ground truth
$ python viz_dataset.py

FreeFIC is also tightly related to the Food Intake Cycle (FIC) dataset, which we created in order to investigate in-meal eating behavior. More information about FIC can be found here and here.

Publications

If you plan to use the FreeFIC dataset or any of the resources found on this page, please cite our work:

@article{kyritsis2020data,
  title={A Data Driven End-to-end Approach for In-the-wild Monitoring of Eating Behavior Using Smartwatches},
  author={Kyritsis, Konstantinos and Diou, Christos and Delopoulos, Anastasios},
  journal={IEEE Journal of Biomedical and Health Informatics},
  year={2020},
  publisher={IEEE}}

@inproceedings{kyritsis2017automated,
  title={Detecting Meals In the Wild Using the Inertial Data of a Typical Smartwatch},
  author={Kyritsis, Konstantinos and Diou, Christos and Delopoulos, Anastasios},
  booktitle={2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)},
  year={2019},
  organization={IEEE}}

Technical details

We provide the FreeFIC dataset as a pickle file. The file can be loaded using Python in the following way:

import pickle as pkl
import numpy as np

with open('./FreeFIC_FreeFIC-heldout.pkl', 'rb') as fh:
    dataset = pkl.load(fh)

The dataset variable in the snippet above is a dictionary with \(5\) keys, namely 'subject_id', 'session_id', 'signals_raw', 'signals_proc' and 'meal_gt'. The contents under a specific key can be obtained by:

sub = dataset['subject_id']     # for the subject id
ses = dataset['session_id']     # for the session id
raw = dataset['signals_raw']    # for the raw IMU signals
proc = dataset['signals_proc']  # for the processed IMU signals
gt = dataset['meal_gt']         # for the meal ground truth

The sub, ses, raw, proc and gt variables in the snippet above are lists with a length equal to \(22\). Elements across all lists are aligned; e.g., the \(3\)rd element of the list under the 'session_id' key corresponds to the \(3\)rd element of the list under the 'signals_proc' key.
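As a quick sanity check, the aligned lists can be traversed in parallel. The snippet below is only an illustrative sketch (it is not one of the provided scripts) and assumes the pickle has been loaded as shown above; for every session it prints the subject id, the session id, the duration of the processed recording and the number of reported meals.

import pickle as pkl

with open('./FreeFIC_FreeFIC-heldout.pkl', 'rb') as fh:
    dataset = pkl.load(fh)

# The five lists are aligned, so they can be zipped together.
for sub, ses, proc, gt in zip(dataset['subject_id'],
                              dataset['session_id'],
                              dataset['signals_proc'],
                              dataset['meal_gt']):
    # The first column of the processed array holds timestamps in seconds.
    duration_h = (proc[-1, 0] - proc[0, 0]) / 3600.0
    # len(gt) equals the number of reported meal intervals of the session.
    print('subject {:2d} | session {:d} | {:5.2f} h | {:d} meal(s)'.format(
        sub, ses, duration_h, len(gt)))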
sub: list
Each element of the sub list is a scalar (integer) that corresponds to the unique identifier of the subject and can take the following values: \([1, 2, 3, 4, 13, 14, 15, 16, 17, 18, 19, 20]\). It should be emphasized that the subjects with ids \(15, 16, 17, 18, 19\) and \(20\) belong to the held-out part of the FreeFIC dataset (more information can be found in the publication titled "A Data Driven End-to-end Approach for In-the-wild Monitoring of Eating Behavior Using Smartwatches" by Kyritsis et al.). Moreover, the subject identifiers in FreeFIC are in line with the subject identifiers in the FIC dataset (more info here and here); i.e., FIC's subject with id equal to \(2\) is the same person as FreeFIC's subject with id equal to \(2\).

ses: list
Each element of this list is a scalar (integer) that corresponds to the unique identifier of the session and can range between \(1\) and \(5\). It should be noted that not all subjects have the same number of sessions.

raw: list
Each element of this list is a dictionary with the 'acc' and 'gyr' keys. The data under the 'acc' key is an \(N_{acc} \times 4\) numpy.ndarray that contains the timestamps in seconds (first column) and the \(3D\) raw accelerometer measurements in \(g\) (second, third and fourth columns, representing the \(x\), \(y\) and \(z\) axes, respectively). The data under the 'gyr' key is an \(N_{gyr} \times 4\) numpy.ndarray that contains the timestamps in seconds (first column) and the \(3D\) raw gyroscope measurements in \(degrees/second\) (second, third and fourth columns, representing the \(x\), \(y\) and \(z\) axes, respectively). All sensor streams are transformed in such a way that all participants appear to wear the smartwatch on the same hand with the same orientation, thus achieving data uniformity. This transformation is consistent with the signals in the FIC dataset (more info here and here). Finally, the lengths of the raw accelerometer and gyroscope numpy.ndarrays differ \((N_{acc} \neq N_{gyr})\). This behavior is expected and is caused by the Android platform.

proc: list
Each element of this list is an \(M \times 7\) numpy.ndarray that contains the timestamps and the \(3D\) accelerometer and gyroscope measurements for each session. Specifically, the first column contains the timestamps in seconds, the second, third and fourth columns contain the \(x\), \(y\) and \(z\) accelerometer values in \(g\), and the fifth, sixth and seventh columns contain the \(x\), \(y\) and \(z\) gyroscope values in \(degrees/second\). Unlike the elements of the raw list, the processed measurements (in the proc list) have a constant sampling rate of \(100\) Hz and the accelerometer/gyroscope measurements are aligned with each other. In addition, all sensor streams are transformed in such a way that all participants appear to wear the smartwatch on the same hand with the same orientation, thus achieving data uniformity. This transformation is consistent with the signals in the FIC dataset (more info here and here). No other preprocessing is performed on the data; e.g., the acceleration component due to the Earth's gravitational field is present in the processed acceleration measurements. Researchers can consult the article "A Data Driven End-to-end Approach for In-the-wild Monitoring of Eating Behavior Using Smartwatches" by Kyritsis et al. on how to further preprocess the IMU signals (i.e., smooth them and remove the gravitational component).

meal_gt: list
Each element of this list is a \(K \times 2\) matrix. Each row represents a meal interval of the specific in-the-wild session. The first column contains the timestamps of the meal start moments, whereas the second contains the timestamps of the meal end moments. All timestamps are in seconds. The number of meals \(K\) varies across recordings (e.g., there is a recording where a participant consumed two meals).
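To make the relation between the processed signals and the meal ground truth concrete, the sketch below slices the accelerometer measurements of the first reported meal of a session and then suppresses the gravitational component with a high-pass Butterworth filter. This is only an illustrative example of one possible preprocessing step, not necessarily the pipeline used in the publications above; the SciPy dependency and the 0.3 Hz cutoff are assumptions made for the sake of the example.

import pickle as pkl
import numpy as np
from scipy import signal  # assumed dependency, used only for this illustration

with open('./FreeFIC_FreeFIC-heldout.pkl', 'rb') as fh:
    dataset = pkl.load(fh)

proc = dataset['signals_proc'][0]  # M x 7 array of the first session
gt = dataset['meal_gt'][0]         # K x 2 array of meal start/end timestamps in seconds

# Slice the samples that fall inside the first reported meal
# (assumes the ground-truth timestamps share the time base of the first signal column).
start, end = gt[0]
meal = proc[(proc[:, 0] >= start) & (proc[:, 0] <= end)]

acc = meal[:, 1:4]  # accelerometer columns (x, y, z) in g

# Suppress the (near-constant) gravity component with a high-pass filter.
fs = 100.0  # processed signals are sampled at 100 Hz
b, a = signal.butter(4, 0.3, btype='highpass', fs=fs)
acc_lin = signal.filtfilt(b, a, acc, axis=0)

print('meal duration: {:.1f} s, {:d} samples'.format(end - start, meal.shape[0]))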
Ethics and funding

Informed consent, including permission for third-party access to anonymised data, was obtained from all subjects prior to their engagement in the study. The work has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement No 727688 - BigO: Big data against childhood obesity.

Contact

Any inquiries regarding the FreeFIC dataset should be addressed to:

Dr. Konstantinos KYRITSIS
Multimedia Understanding Group (MUG)
Department of Electrical & Computer Engineering
Aristotle University of Thessaloniki
University Campus, Building C, 3rd floor
Thessaloniki, Greece, GR54124
Tel: +30 2310 996359, 996365
Fax: +30 2310 996398
E-mail: kokirits [at] mug [dot] ee [dot] auth [dot] gr

Keywords

gyroscope, food intake, free-living, FreeFIC, inertial, eating behavior, wrist-mounted, IMU, smartwatch, in-the-wild, accelerometer, food intake cycle

Metrics (BIP! / OpenAIRE UsageCounts): 0 citations · 686 views · 216 downloads