
Categorizing human vocal signals depends on an integrated auditory‐frontal cortical network


Bibliographic Details

Main Authors: Roswandowitz, Claudia, Swanborough, Huw, Frühholz, Sascha
Format: Online Article Text
Language: English
Published: John Wiley & Sons, Inc. 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7927295/
https://www.ncbi.nlm.nih.gov/pubmed/33615612
http://dx.doi.org/10.1002/hbm.25309
Description: Voice signals are relevant for auditory communication and are suggested to be processed in dedicated auditory cortex (AC) regions. While recent reports have highlighted an additional role of the inferior frontal cortex (IFC), a detailed description of the integrated functioning of the AC–IFC network and its task relevance for voice processing is missing. Using neuroimaging, we tested sound categorization while human participants focused either on the higher‐order vocal‐sound dimension (voice task) or on the feature‐based intensity dimension (loudness task) while listening to the same sound material. We found differential involvement of the AC and IFC depending on the task performed and on whether the voice dimension was task relevant. First, when comparing neural vocal‐sound processing in our task‐based design with previously reported passive listening designs, we observed highly similar cortical activations in the AC and IFC. Second, during task‐based vocal‐sound processing we observed voice‐sensitive responses in the AC and IFC, whereas intensity processing was restricted to distinct AC regions. Third, the IFC flexibly adapted to the vocal sounds' task relevance, being active only when the voice dimension was task relevant. Fourth and finally, connectivity modeling revealed that vocal signals, independent of their task relevance, provided significant input to bilateral AC. However, only when attention was on the voice dimension did we find significant modulations of auditory‐frontal connections. Our findings suggest that an integrated auditory‐frontal network is essential for behaviorally relevant vocal‐sound processing. The IFC seems to be an important hub of the extended voice network when representing higher‐order vocal objects and guiding goal‐directed behavior.
Published by John Wiley & Sons, Inc., 2020-12-08. /pmc/articles/PMC7927295/ /pubmed/33615612 http://dx.doi.org/10.1002/hbm.25309

© 2020 The Authors. Human Brain Mapping published by Wiley Periodicals LLC. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (http://creativecommons.org/licenses/by-nc/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes.