Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network
Main Authors: Kanazawa, Yuji; Nakamura, Kimihiro; Ishii, Toru; Aso, Toshihiko; Yamazaki, Hiroshi; Omori, Koichi
Format: Online Article Text
Language: English
Published: Public Library of Science, 2017
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5607140/ https://www.ncbi.nlm.nih.gov/pubmed/28931014 http://dx.doi.org/10.1371/journal.pone.0177599
_version_ | 1783265234334515200 |
author | Kanazawa, Yuji; Nakamura, Kimihiro; Ishii, Toru; Aso, Toshihiko; Yamazaki, Hiroshi; Omori, Koichi |
author_facet | Kanazawa, Yuji; Nakamura, Kimihiro; Ishii, Toru; Aso, Toshihiko; Yamazaki, Hiroshi; Omori, Koichi |
author_sort | Kanazawa, Yuji |
collection | PubMed |
description | Sign language is an essential medium of everyday social interaction for deaf people and plays a critical role in verbal learning. In particular, language development in deaf individuals is likely to rely heavily on verbal short-term memory (STM) via sign language. Most previous studies compared neural activations during signed language processing in deaf signers with those during spoken language processing in hearing speakers. For sign language users, it thus remains unclear how visuospatial inputs are converted into the verbal STM operating in the left-hemisphere language network. Using functional magnetic resonance imaging, the present study investigated neural activation while bilinguals of spoken and signed language performed a sequence memory span task. On each trial, participants viewed a nonsense syllable sequence presented either as written letters or as fingerspelling (4–7 syllables in length) and then held the syllable sequence for 12 s. Behavioral analysis revealed that participants relied on phonological memory while holding verbal information, regardless of input modality. At the neural level, this maintenance stage broadly activated the left-hemisphere language network, including the inferior frontal gyrus, supplementary motor area, superior temporal gyrus and inferior parietal lobule, for both the letter and fingerspelling conditions. Interestingly, although most participants reported relying on phonological memory during maintenance, direct comparisons between the letter and fingerspelling conditions revealed strikingly different patterns of neural activation during the same period. Namely, the effortful maintenance of fingerspelling inputs relative to letter inputs activated the left superior parietal lobule and dorsal premotor area, i.e., brain regions known to play a role in the visuomotor analysis of hand/arm movements.
These findings suggest that the dorsal visuomotor neural system subserves verbal learning via sign language by relaying gestural inputs to the classical left-hemisphere language network. |
format | Online Article Text |
id | pubmed-5607140 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2017 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-5607140 2017-10-09 Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network Kanazawa, Yuji; Nakamura, Kimihiro; Ishii, Toru; Aso, Toshihiko; Yamazaki, Hiroshi; Omori, Koichi PLoS One Research Article Public Library of Science 2017-09-20 /pmc/articles/PMC5607140/ /pubmed/28931014 http://dx.doi.org/10.1371/journal.pone.0177599 Text en © 2017 Kanazawa et al http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Kanazawa, Yuji; Nakamura, Kimihiro; Ishii, Toru; Aso, Toshihiko; Yamazaki, Hiroshi; Omori, Koichi Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network |
title | Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network |
title_full | Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network |
title_fullStr | Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network |
title_full_unstemmed | Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network |
title_short | Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network |
title_sort | phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5607140/ https://www.ncbi.nlm.nih.gov/pubmed/28931014 http://dx.doi.org/10.1371/journal.pone.0177599 |
work_keys_str_mv | AT kanazawayuji phonologicalmemoryinsignlanguagereliesonthevisuomotorneuralsystemoutsidethelefthemispherelanguagenetwork AT nakamurakimihiro phonologicalmemoryinsignlanguagereliesonthevisuomotorneuralsystemoutsidethelefthemispherelanguagenetwork AT ishiitoru phonologicalmemoryinsignlanguagereliesonthevisuomotorneuralsystemoutsidethelefthemispherelanguagenetwork AT asotoshihiko phonologicalmemoryinsignlanguagereliesonthevisuomotorneuralsystemoutsidethelefthemispherelanguagenetwork AT yamazakihiroshi phonologicalmemoryinsignlanguagereliesonthevisuomotorneuralsystemoutsidethelefthemispherelanguagenetwork AT omorikoichi phonologicalmemoryinsignlanguagereliesonthevisuomotorneuralsystemoutsidethelefthemispherelanguagenetwork |