Powered by OpenAIRE graph
ZENODO
Dataset · 2024 · License: CC0 · Data source: ZENODO

DRYAD
Dataset · 2024 · License: CC0 · Data source: Datacite

2 versions available

Repeatedly experiencing the McGurk effect induces long-lasting changes in auditory speech perception

Authors: Magnotti, John; Beauchamp, Michael


Abstract

# Repeatedly experiencing the McGurk effect induces long-lasting changes in auditory speech perception

These data are individual participant reports collected during presentation of speech stimuli.

## Description of the data and file structure

The data are contained in the file **SummaryData.xlsx**, which has two worksheets. The "**batch1**" sheet contains data from the main experiment; the "**mturk**" sheet contains data from the replication study. In each sheet, one row corresponds to the responses to a particular stimulus on a particular day, aggregated across repeated presentations of that stimulus (if applicable).

In both sheets:

* The first column is the participant ID (one ID per participant); the second column is the name of the stimulus file.
* The "day_type" column indicates pre-test, training, post-test, or long-term post-test.
* The "time" column indicates the experimental day.
* "stimulusPresentation" is A for auditory-only, AV for audiovisual.
* "stimulusType" refers to auditory and visual congruency (congruent or McGurk).
* "n_responses" is the total number of responses for that stimulus on that day.
* "resp_X" is the number of each type of response, where A is auditory, F is fusion, O is other, and V is visual.
* "stimulusPaperName" is identical to the stimulus name, except for some stimuli that are referred to in the paper by an abbreviation (e.g., S1).
* "training" indicates whether a stimulus was presented on training days.

## Sharing/Access information

The stimuli presented to the participants are available at [https://openwetware.org/wiki/Beauchamp:Stimuli](https://openwetware.org/wiki/Beauchamp:Stimuli)

## Code/Software

The R code necessary to analyze the data and produce the manuscript results is in the compiled R Markdown file full_results.html.
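As a minimal sketch of how the columns described above fit together, the following Python snippet computes the proportion of fusion ("resp_F") responses per day type for auditory-only presentations. The rows here are invented for illustration (the real data are in the "batch1" and "mturk" worksheets of SummaryData.xlsx), and the analysis in the paper is carried out by the authors' R code, not this snippet.

```python
from collections import defaultdict

# Illustrative rows mimicking the described SummaryData.xlsx columns.
# Values are invented for demonstration only.
rows = [
    {"participant": "P01", "stimulus": "ba_ga", "day_type": "pre-test",
     "stimulusPresentation": "A", "n_responses": 10,
     "resp_A": 8, "resp_F": 1, "resp_O": 1, "resp_V": 0},
    {"participant": "P01", "stimulus": "ba_ga", "day_type": "post-test",
     "stimulusPresentation": "A", "n_responses": 10,
     "resp_A": 3, "resp_F": 6, "resp_O": 1, "resp_V": 0},
]

# Tally fusion responses vs. total responses per day_type,
# restricted to auditory-only ("A") presentations.
fusion_counts = defaultdict(lambda: [0, 0])  # day_type -> [fusion, total]
for r in rows:
    if r["stimulusPresentation"] == "A":
        fusion_counts[r["day_type"]][0] += r["resp_F"]
        fusion_counts[r["day_type"]][1] += r["n_responses"]

rates = {d: f / n for d, (f, n) in fusion_counts.items()}
print(rates)  # {'pre-test': 0.1, 'post-test': 0.6}
```

A rise in the auditory-only fusion rate from pre-test to post-test is the signature of the fusion-induced recalibration (FIR) effect described in the abstract below.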

In the McGurk effect, presentation of incongruent auditory and visual speech evokes a fusion percept different from either component modality. We show that repeatedly experiencing the McGurk effect for 14 days induces a change in auditory-only speech perception: the auditory component of the McGurk stimulus begins to evoke the fusion percept, even when presented on its own without accompanying visual speech. This perceptual change, termed fusion-induced recalibration (FIR), was talker-specific and syllable-specific and persisted for a year or more in some participants without any additional McGurk exposure. Participants who did not experience the McGurk effect did not experience FIR, showing that recalibration was driven by multisensory prediction error. A causal inference model of speech perception incorporating multisensory cue conflict accurately predicted individual differences in FIR. Just as the McGurk effect demonstrates that visual speech can alter the perception of auditory speech, FIR shows that these alterations can persist for months or years. The ability to induce seemingly permanent changes in auditory speech perception will be useful for studying plasticity in brain networks for language and may provide new strategies for improving language learning.

Related Organizations
Keywords

FOS: Psychology, Multisensory, Speech, Perception, Visual, Auditory

  • BIP! impact indicators: selected citations: 1; popularity: Average; influence: Average; impulse: Average