
Significance: We demonstrate that cortical oscillatory activity in both low (<8 Hz) and high (15–30 Hz) frequency bands is tightly coupled to behavioral performance in musical listening, in a bidirectional manner. In light of previous work on speech, we propose a framework in which the brain exploits the temporal regularities in music: lower frequencies support entrainment, allowing individual notes to be accurately parsed from the sound stream, while higher frequencies are associated with predictive models that generate temporal and content-based predictions of subsequent note events.
Cerebral Cortex, Neuronal Plasticity, Time Factors, Acoustic Stimulation, Auditory Perception, Magnetoencephalography, Models, Psychological, Pitch Perception, Music
| Indicator | Description | Value |
| --- | --- | --- |
| Citations | Reflects the overall/total impact of the article in the research community at large, based on the underlying citation network (diachronically); an alternative to the "Influence" indicator. | 262 |
| Popularity | Reflects the "current" impact/attention (the "hype") of the article in the research community at large, based on the underlying citation network. | Top 1% |
| Influence | Reflects the overall/total impact of the article in the research community at large, based on the underlying citation network (diachronically). | Top 10% |
| Impulse | Reflects the initial momentum of the article directly after its publication, based on the underlying citation network. | Top 1% |
