ZENODO
Dataset · 2021
License: CC BY
Data sources: Datacite, ZENODO
Extension of the Action Verb Corpus (AVCext)

Authors: Matthias Hirschmanner, Stephanie Gross, Brigitte Krenn, Friedrich Neubarth, Martin Trapp, Michael Zillich, Markus Vincze

Abstract

The extension to the Action Verb Corpus (AVC) consists of 41 recordings by 2 users experienced with the system, performing the same three actions as in AVC: take (208 instances), put (208 instances), and push (91 instances). The actions were performed without any instructions. The focus of the extension is to facilitate action recognition. In contrast to AVC, no speech-related information is annotated in the extension dataset; the other ELAN annotations available with AVC are also available for the extension dataset. Additionally, the actions are annotated at two degrees of granularity: coarse labels (take, put, push) and fine labels that split each motion into more granular motion primitives (reach, grab, moveObject, and place). These annotations are available as eaf (ELAN) files, as csv files, and as two separate columns in the merged files. Details about the collected data can be found in Matthias Hirschmanner, Stephanie Gross, Brigitte Krenn, Friedrich Neubarth, Martin Trapp and Markus Vincze: "Extension of the Action Verb Corpus for Supervised Learning", ARW 2018.
The dataset consists of the following information:
  • the merged output of the hand and object trackers (AVCExtension_Merged.zip, one csv file per episode/recording), with the ID encodings:
    HandID: 0 right, 1 left
    FingerID: 0 thumb, 1 index, 2 middle, 3 ring, 4 pinky
    BoneID: 0 metacarpal, 1 proximal, 2 intermediate, 3 distal
  • the output of the object trackers (AVCExtension_Objects.zip, one csv file per episode/recording), including the object poses and their reliability estimate calculated by the object tracker, whether an object is touched by or is in the hand of the instructor, and whether the object touches the table; for the coordinate system applied see the picture "coord_system.png"
  • the videos from Leap Motion showing the hand movements and objects (AVCExtension_video_libm.zip, one avi file per episode/recording)
  • an animation of the merged hand and object tracking (AVCExtension_video_schematic.zip, one avi file per episode/recording)
  • the following annotations, synchronized with the real-time animation of the hand and object tracking (available as ELAN files, AVCExtension_Annotations_eaf.zip, and as csv files, AVCExtension_Annotations_csv.zip, one file per episode/recording):
    which object is currently moved, and where it is moved to (automatically annotated)
    whether a hand touches a particular object (manually annotated)
    whether a particular object touches the ground/table (automatically annotated)
    coarse-grained action annotation: take, put, push (manually annotated)
    fine-grained action annotation: reach, grab, moveObject, place (manually annotated)
    position of the objects in the scene (automatically calculated from the output of the object tracker)

Acknowledgments

Corpus creation and annotation was supported by the WWTF project RALLI. The dataset was recorded at ACIN, TUW.
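The integer ID encodings above can be decoded with small lookup tables. The sketch below is illustrative only: the column names `HandID`, `FingerID`, and `BoneID` are taken from the description above but should be checked against the actual CSV headers in AVCExtension_Merged.zip before use.

```python
import csv
import io

# Lookup tables from the ID encodings documented above.
HAND = {0: "right", 1: "left"}
FINGER = {0: "thumb", 1: "index", 2: "middle", 3: "ring", 4: "pinky"}
BONE = {0: "metacarpal", 1: "proximal", 2: "intermediate", 3: "distal"}

def decode_row(row):
    """Map the numeric tracker IDs of one CSV row to human-readable labels."""
    return {
        "hand": HAND[int(row["HandID"])],
        "finger": FINGER[int(row["FingerID"])],
        "bone": BONE[int(row["BoneID"])],
    }

# Minimal illustration with a fabricated two-row sample (not real corpus data):
sample = "HandID,FingerID,BoneID\n0,1,3\n1,4,0\n"
rows = [decode_row(r) for r in csv.DictReader(io.StringIO(sample))]
# rows[0] -> {'hand': 'right', 'finger': 'index', 'bone': 'distal'}
```

In practice one would open each per-episode CSV from the archive with `csv.DictReader` and apply `decode_row` per row; the real files also carry pose columns not modeled here.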

Keywords

action description, dataset, object/action labelling, multimodality

  • BIP! impact indicators: selected citations: 0 · popularity: Average · influence: Average · impulse: Average
  • OpenAIRE UsageCounts: 8 views