
Impact of language on functional connectivity for audiovisual speech integration

Visual information about lip and facial movements plays a role in audiovisual (AV) speech perception. Although this has been widely confirmed, previous behavioural studies have shown interlanguage differences, that is, native Japanese speakers do not integrate auditory and visual speech as closely as native English speakers. To elucidate the neural basis of such interlanguage differences, 22 native English speakers and 24 native Japanese speakers were examined in behavioural or functional Magnetic Resonance Imaging (fMRI) experiments while mono-syllabic speech was presented under AV, auditory-only, or visual-only conditions for speech identification. Behavioural results indicated that the English speakers identified visual speech more quickly than the Japanese speakers, and that the temporal facilitation effect of congruent visual speech was significant in the English speakers but not in the Japanese speakers. Using fMRI data, we examined the functional connectivity among brain regions important for auditory-visual interplay. The results indicated that the English speakers had significantly stronger connectivity between the visual motion area MT and the Heschl’s gyrus compared with the Japanese speakers, which may subserve lower-level visual influences on speech perception in English speakers in a multisensory environment. These results suggested that linguistic experience strongly affects neural connectivity involved in AV speech integration.


Bibliographic Details
Main Authors: Shinozaki, Jun, Hiroe, Nobuo, Sato, Masa-aki, Nagamine, Takashi, Sekiyama, Kaoru
Format: Online Article Text
Language: English
Published: Nature Publishing Group 2016
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4980767/
https://www.ncbi.nlm.nih.gov/pubmed/27510407
http://dx.doi.org/10.1038/srep31388
_version_ 1782447511326162944
author Shinozaki, Jun
Hiroe, Nobuo
Sato, Masa-aki
Nagamine, Takashi
Sekiyama, Kaoru
author_facet Shinozaki, Jun
Hiroe, Nobuo
Sato, Masa-aki
Nagamine, Takashi
Sekiyama, Kaoru
author_sort Shinozaki, Jun
collection PubMed
description Visual information about lip and facial movements plays a role in audiovisual (AV) speech perception. Although this has been widely confirmed, previous behavioural studies have shown interlanguage differences, that is, native Japanese speakers do not integrate auditory and visual speech as closely as native English speakers. To elucidate the neural basis of such interlanguage differences, 22 native English speakers and 24 native Japanese speakers were examined in behavioural or functional Magnetic Resonance Imaging (fMRI) experiments while mono-syllabic speech was presented under AV, auditory-only, or visual-only conditions for speech identification. Behavioural results indicated that the English speakers identified visual speech more quickly than the Japanese speakers, and that the temporal facilitation effect of congruent visual speech was significant in the English speakers but not in the Japanese speakers. Using fMRI data, we examined the functional connectivity among brain regions important for auditory-visual interplay. The results indicated that the English speakers had significantly stronger connectivity between the visual motion area MT and the Heschl’s gyrus compared with the Japanese speakers, which may subserve lower-level visual influences on speech perception in English speakers in a multisensory environment. These results suggested that linguistic experience strongly affects neural connectivity involved in AV speech integration.
format Online
Article
Text
id pubmed-4980767
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher Nature Publishing Group
record_format MEDLINE/PubMed
spelling pubmed-49807672016-08-19 Impact of language on functional connectivity for audiovisual speech integration Shinozaki, Jun Hiroe, Nobuo Sato, Masa-aki Nagamine, Takashi Sekiyama, Kaoru Sci Rep Article Visual information about lip and facial movements plays a role in audiovisual (AV) speech perception. Although this has been widely confirmed, previous behavioural studies have shown interlanguage differences, that is, native Japanese speakers do not integrate auditory and visual speech as closely as native English speakers. To elucidate the neural basis of such interlanguage differences, 22 native English speakers and 24 native Japanese speakers were examined in behavioural or functional Magnetic Resonance Imaging (fMRI) experiments while mono-syllabic speech was presented under AV, auditory-only, or visual-only conditions for speech identification. Behavioural results indicated that the English speakers identified visual speech more quickly than the Japanese speakers, and that the temporal facilitation effect of congruent visual speech was significant in the English speakers but not in the Japanese speakers. Using fMRI data, we examined the functional connectivity among brain regions important for auditory-visual interplay. The results indicated that the English speakers had significantly stronger connectivity between the visual motion area MT and the Heschl’s gyrus compared with the Japanese speakers, which may subserve lower-level visual influences on speech perception in English speakers in a multisensory environment. These results suggested that linguistic experience strongly affects neural connectivity involved in AV speech integration. Nature Publishing Group 2016-08-11 /pmc/articles/PMC4980767/ /pubmed/27510407 http://dx.doi.org/10.1038/srep31388 Text en Copyright © 2016, The Author(s) http://creativecommons.org/licenses/by/4.0/ This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/
spellingShingle Article
Shinozaki, Jun
Hiroe, Nobuo
Sato, Masa-aki
Nagamine, Takashi
Sekiyama, Kaoru
Impact of language on functional connectivity for audiovisual speech integration
title Impact of language on functional connectivity for audiovisual speech integration
title_full Impact of language on functional connectivity for audiovisual speech integration
title_fullStr Impact of language on functional connectivity for audiovisual speech integration
title_full_unstemmed Impact of language on functional connectivity for audiovisual speech integration
title_short Impact of language on functional connectivity for audiovisual speech integration
title_sort impact of language on functional connectivity for audiovisual speech integration
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4980767/
https://www.ncbi.nlm.nih.gov/pubmed/27510407
http://dx.doi.org/10.1038/srep31388
work_keys_str_mv AT shinozakijun impactoflanguageonfunctionalconnectivityforaudiovisualspeechintegration
AT hiroenobuo impactoflanguageonfunctionalconnectivityforaudiovisualspeechintegration
AT satomasaaki impactoflanguageonfunctionalconnectivityforaudiovisualspeechintegration
AT nagaminetakashi impactoflanguageonfunctionalconnectivityforaudiovisualspeechintegration
AT sekiyamakaoru impactoflanguageonfunctionalconnectivityforaudiovisualspeechintegration