Categorizing human vocal signals depends on an integrated auditory‐frontal cortical network

Bibliographic Details
Main Authors: Roswandowitz, Claudia; Swanborough, Huw; Frühholz, Sascha
Format: Online Article (Text)
Language: English
Published: John Wiley & Sons, Inc., 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7927295/
https://www.ncbi.nlm.nih.gov/pubmed/33615612
http://dx.doi.org/10.1002/hbm.25309
Description
Summary: Voice signals are relevant for auditory communication and are suggested to be processed in dedicated auditory cortex (AC) regions. While recent reports have highlighted an additional role of the inferior frontal cortex (IFC), a detailed description of the integrated functioning of the AC–IFC network and its task relevance for voice processing is missing. Using neuroimaging, we tested sound categorization while human participants listened to the same sound material and focused either on the higher‐order vocal‐sound dimension (voice task) or on the feature‐based intensity dimension (loudness task). We found differential involvement of the AC and IFC depending on the task performed and on whether the voice dimension was task relevant. First, when comparing neural vocal‐sound processing in our task‐based design with previously reported passive‐listening designs, we observed highly similar cortical activations in the AC and IFC. Second, during task‐based vocal‐sound processing we observed voice‐sensitive responses in the AC and IFC, whereas intensity processing was restricted to distinct AC regions. Third, the IFC flexibly adapted to the vocal sounds' task relevance, being active only when the voice dimension was task relevant. Fourth and finally, connectivity modeling revealed that vocal signals, independent of their task relevance, provided significant input to bilateral AC; however, only when attention was on the voice dimension did we find significant modulations of auditory‐frontal connections. Our findings suggest that an integrated auditory‐frontal network is essential for behaviorally relevant vocal‐sound processing. The IFC seems to be an important hub of the extended voice network, representing higher‐order vocal objects and guiding goal‐directed behavior.