Powered by OpenAIRE graph
NeuroImage . Article . 2019 . Peer-reviewed
License: Elsevier TDM
Data sources: Crossref

Versions (View all 7 versions):
- bioRxiv . Article . 2019 . Peer-reviewed . https://doi.org/10.1101/607499... (Data sources: Crossref)
- NeuroImage . Article . 2019 . Peer-reviewed (License: Elsevier TDM; Data sources: Crossref)
- NeuroImage . Article (Data sources: UnpayWall)
- NeuroImage . Article . 2020
- Open Science Framework . Other literature type . 2019 (Data sources: Datacite)
- DBLP . Article . 2020 (Data sources: DBLP)

Untangling featural and conceptual object representations

Authors: Tijl Grootswagers; Amanda K. Robinson; Sophia M. Shatek; Thomas A. Carlson

Abstract

How are visual inputs transformed into conceptual representations by the human visual system? The contents of human perception, such as objects presented on a visual display, can reliably be decoded from voxel activation patterns in fMRI, and from evoked sensor activations in MEG and EEG. A prevailing question is the extent to which brain activation associated with object categories is due to statistical regularities of visual features within object categories. Here, we assessed the contribution of mid-level features to conceptual category decoding using EEG and a novel fast periodic decoding paradigm. Our study used a stimulus set consisting of intact objects from the animate (e.g., fish) and inanimate categories (e.g., chair) and scrambled versions of the same objects that were unrecognisable and preserved their visual features (Long, Yu, & Konkle, 2018). By presenting the images at different periodic rates, we biased processing to different levels of the visual hierarchy. We found that scrambled objects and their intact counterparts elicited similar patterns of activation, which could be used to decode the conceptual category (animate or inanimate), even for the unrecognisable scrambled objects. Animacy decoding for the scrambled objects, however, was only possible at the slowest periodic presentation rate. Animacy decoding for intact objects was faster, more robust, and could be achieved at faster presentation rates. Our results confirm that the mid-level visual features preserved in the scrambled objects contribute to animacy decoding, but also demonstrate that the dynamics vary markedly for intact versus scrambled objects. Our findings suggest a complex interplay between visual feature coding and categorical representations that is mediated by the visual system's capacity to use image features to resolve a recognisable object.
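The decoding approach described in the abstract, classifying conceptual category (animate vs. inanimate) from sensor activation patterns at each timepoint, can be illustrated with a minimal sketch. This is not the authors' pipeline; it uses synthetic "EEG" data and a simple cross-validated nearest-centroid classifier, and every name and parameter below (trial counts, channel counts, the injected class pattern) is an illustrative assumption.

```python
# Minimal sketch of time-resolved category decoding from synthetic EEG-like
# data: a linear pattern distinguishing the two classes is injected after a
# mock "stimulus onset", and a cross-validated nearest-centroid classifier
# is trained and tested independently at each timepoint.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 80, 16, 50
labels = np.repeat([0, 1], n_trials // 2)      # 0 = inanimate, 1 = animate

# Synthetic sensor data: pure noise, plus a class-dependent spatial pattern
# on animate trials from timepoint 20 onward (the mock "onset").
data = rng.standard_normal((n_trials, n_channels, n_times))
pattern = rng.standard_normal(n_channels)
data[labels == 1, :, 20:] += 0.8 * pattern[:, None]

def decode_timecourse(data, labels, n_folds=4):
    """Leave-one-fold-out nearest-centroid decoding at each timepoint."""
    n_trials = len(labels)
    folds = np.arange(n_trials) % n_folds      # interleaved fold assignment
    acc = np.zeros(data.shape[-1])
    for t in range(data.shape[-1]):
        X = data[:, :, t]                      # trials x channels at time t
        correct = 0
        for f in range(n_folds):
            train, test = folds != f, folds == f
            c0 = X[train & (labels == 0)].mean(axis=0)   # class centroids
            c1 = X[train & (labels == 1)].mean(axis=0)
            d0 = np.linalg.norm(X[test] - c0, axis=1)
            d1 = np.linalg.norm(X[test] - c1, axis=1)
            correct += np.sum((d1 < d0) == labels[test]) # predict nearer class
        acc[t] = correct / n_trials
    return acc

acc = decode_timecourse(data, labels)
print(f"pre-onset accuracy  ~ {acc[:20].mean():.2f}")   # chance level
print(f"post-onset accuracy ~ {acc[20:].mean():.2f}")   # above chance
```

In the actual study this kind of classifier would be applied to real evoked EEG responses at each presentation rate; here the point is only the structure of the analysis, in which decoding accuracy rises above chance once class-specific pattern information is present in the sensors.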

Keywords

Adult, Adolescent, brain, Electroencephalography, Recognition, Psychology, Signal Processing, Computer-Assisted, Middle Aged, visual pathways, Young Adult, Pattern Recognition, Visual, magnetic resonance imaging, Humans, Female, Visual Cortex

  • BIP! impact indicators
    - Selected citations: 42. These citations are derived from selected sources. This is an alternative to the "Influence" indicator, which also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    - Popularity: Top 10%. This indicator reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network.
    - Influence: Top 10%. This indicator reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    - Impulse: Top 10%. This indicator reflects the initial momentum of an article directly after its publication, based on the underlying citation network.
Open Access routes: Green; Gold