TY - JOUR
T1 - Neural encoding of auditory features during music perception and imagery
AU - Martin, Stephanie
AU - Mikutta, Christian
AU - Leonard, Matthew K.
AU - Hungate, Dylan
AU - Koelsch, Stefan
AU - Shamma, Shihab
AU - Chang, Edward F.
AU - Millán, José del R.
AU - Knight, Robert T.
AU - Pasley, Brian N.
N1 - Publisher Copyright:
© The Author 2017. Published by Oxford University Press.
PY - 2018/12/1
Y1 - 2018/12/1
AB - Despite many behavioral and neuroimaging investigations, it remains unclear how the human cortex represents spectrotemporal sound features during auditory imagery, and how this representation compares to auditory perception. To assess this, we recorded electrocorticographic signals from an epileptic patient with proficient music ability in 2 conditions. First, the participant played 2 piano pieces on an electronic piano with the sound volume of the digital keyboard on. Second, the participant replayed the same piano pieces, but without auditory feedback, and the participant was asked to imagine hearing the music in his mind. In both conditions, the sound output of the keyboard was recorded, thus allowing precise time-locking between the neural activity and the spectrotemporal content of the music imagery. This novel task design provided a unique opportunity to apply receptive field modeling techniques to quantitatively study neural encoding during auditory mental imagery. In both conditions, we built encoding models to predict high gamma neural activity (70-150 Hz) from the spectrogram representation of the recorded sound. We found robust spectrotemporal receptive fields during auditory imagery with substantial, but not complete overlap in frequency tuning and cortical location compared to receptive fields measured during auditory perception.
KW - auditory cortex
KW - electrocorticography
KW - frequency tuning
KW - spectrotemporal receptive fields
UR - http://www.scopus.com/inward/record.url?scp=85056252735&partnerID=8YFLogxK
U2 - 10.1093/cercor/bhx277
DO - 10.1093/cercor/bhx277
M3 - Article
C2 - 29088345
AN - SCOPUS:85056252735
SN - 1047-3211
VL - 28
SP - 4222
EP - 4233
JO - Cerebral Cortex
JF - Cerebral Cortex
IS - 12
ER -