Powered by OpenAIRE graph
Versions (6 total):
• Robotics and Autonomous Systems — Article; License: CC BY-ND; Data source: UnpayWall
• Robotics and Autonomous Systems — Article, 2019, peer-reviewed; License: Elsevier TDM; Data source: Crossref
• DBLP — Article, 2025; Data source: DBLP
• http://dx.doi.org/10.1016/j.ro... — Article; License: Elsevier TDM; Data source: Sygma

Context-based affordance segmentation from 2D images for robot actions

Authors: Lüddecke, Timo; Kulvicius, Tomas; Wörgötter, Florentin


Abstract

Affordances play a crucial role in robotics because they enable truly autonomous robots that can freely explore and interact with their environment. Most existing approaches for analyzing affordances in a scene consider only one or a few types of affordance, e.g., grasping points, object manipulation, or locomotion, and in many cases only whole objects are considered. In our study we consider 12 affordances in total, spanning object-related, manipulation, and locomotion affordances of both objects and their parts. We design a system that densely predicts affordances given only a single 2D RGB image. To this end, we propose a method that transfers object class labels to affordances. This enables us to train convolutional neural networks (a PSPNet-based network and a U-Net-style network) to predict affordances directly from an image using a selective binary cross-entropy loss function. The method handles (potentially multiple) affordances of objects and their parts in a pixel-wise manner, even in the case of incomplete data. We perform qualitative as well as quantitative evaluations with simulated and real data, including robot experiments. In general, we find that frequent affordances are recognized with a substantial fraction of correctly assigned pixels, whereas this is harder for infrequent affordances and small objects. In addition, we demonstrate that our method performs better than a recent competitive approach. Because the proposed method operates on 2D images, it is easier to implement than competing 3D methods and could therefore more readily provide useful affordance estimates for robotic actions, as demonstrated experimentally.
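The "selective binary cross-entropy" loss mentioned in the abstract can be understood as a masked BCE that accumulates error only where the affordance ground truth is known, which is what makes training on incomplete, multi-label pixel annotations possible. The sketch below is an assumption about how such a loss could look; the function name, the mask convention, and the NumPy formulation are illustrative, not the authors' implementation.

```python
import numpy as np

def selective_bce(pred, target, known_mask, eps=1e-7):
    """Masked binary cross-entropy (hypothetical sketch).

    pred, target, known_mask share the same shape, e.g.
    (H, W, 12) for a dense 12-affordance prediction map.
    known_mask is 1.0 where the affordance label is known
    and 0.0 where it is missing.
    """
    # Clip predictions so log() stays finite.
    p = np.clip(pred, eps, 1.0 - eps)
    # Standard per-pixel, per-affordance binary cross-entropy.
    bce = -(target * np.log(p) + (1.0 - target) * np.log(1.0 - p))
    # Zero out entries whose ground truth is unknown, then
    # average only over the known entries.
    masked = bce * known_mask
    return masked.sum() / max(known_mask.sum(), 1.0)
```

Because each of the 12 affordance channels is treated as an independent binary label, a pixel can carry several affordances at once (e.g. an object part that is both graspable and movable), and unknown channels simply contribute no gradient.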

Country: Germany

Keywords: deep learning, affordance, robotic action

Impact (BIP!):
• selected citations: 17 — derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically)
• popularity: Top 10% — reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network
• influence: Top 10% — reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically)
• impulse: Top 10% — reflects the initial momentum of an article directly after its publication, based on the underlying citation network

Usage (OpenAIRE UsageCounts):
• views: 5
• downloads: 22
Open Access status: Green; hybrid