The Effect of Visual Articulatory Information on the Neural Correlates of Non-native Speech Sound Discrimination
Behavioral studies have shown that the ability to discriminate between non-native speech sounds improves after seeing how the sounds are articulated. This study examined the influence of visual articulatory information on the neural correlates of non-native speech sound discrimination.
Main Authors: Plumridge, James M. A.; Barham, Michael P.; Foley, Denise L.; Ware, Anna T.; Clark, Gillian M.; Albein-Urios, Natalia; Hayden, Melissa J.; Lum, Jarrad A. G.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2020
Subjects: Human Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7019039/ https://www.ncbi.nlm.nih.gov/pubmed/32116609 http://dx.doi.org/10.3389/fnhum.2020.00025
_version_ | 1783497437462134784 |
author | Plumridge, James M. A.; Barham, Michael P.; Foley, Denise L.; Ware, Anna T.; Clark, Gillian M.; Albein-Urios, Natalia; Hayden, Melissa J.; Lum, Jarrad A. G. |
author_sort | Plumridge, James M. A. |
collection | PubMed |
description | Behavioral studies have shown that the ability to discriminate between non-native speech sounds improves after seeing how the sounds are articulated. This study examined the influence of visual articulatory information on the neural correlates of non-native speech sound discrimination. English speakers’ discrimination of the Hindi dental and retroflex sounds was measured using the mismatch negativity (MMN) event-related potential, before and after they completed one of three 8-min training conditions. In an audio-visual speech training condition (n = 14), each sound was presented with its corresponding visual articulation. In one control condition (n = 14), both sounds were presented with the same visual articulation, resulting in one congruent and one incongruent audio-visual pairing. In another control condition (n = 14), both sounds were presented with the same image of a still face. The control conditions aimed to rule out the possibility that the MMN is influenced by non-specific audio-visual pairings, or by general exposure to the dental and retroflex sounds over the course of the study. The results showed that audio-visual speech training reduced the latency of the MMN but did not affect MMN amplitude. No change in MMN amplitude or latency was observed for the two control conditions. The pattern of results suggests that a relatively short audio-visual speech training session (i.e., 8 min) may increase the speed with which the brain processes non-native speech sound contrasts. The absence of a training effect on MMN amplitude suggests a single session of audio-visual speech training does not lead to the formation of more discrete memory traces for non-native speech sounds. Longer and/or multiple sessions might be needed to influence the MMN amplitude. |
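The description above quantifies the training effect with two MMN measures, peak amplitude and peak latency, both read from the deviant-minus-standard difference wave of the event-related potential. The Python sketch below is an illustrative aside and not code or data from the study: the averaged ERP arrays, the 100–300 ms search window, the sampling rate, and the synthetic waveform are all assumptions, used only to show how a difference-wave peak amplitude and latency might be extracted.

```python
import numpy as np

def mmn_peak(standard_erp, deviant_erp, times, window=(0.10, 0.30)):
    """Return peak MMN amplitude and latency from averaged ERPs.

    The MMN is taken as the most negative point of the deviant-minus-standard
    difference wave inside the search window (the window bounds are assumptions
    for this example, not values reported in the article).
    """
    difference = deviant_erp - standard_erp              # difference wave
    mask = (times >= window[0]) & (times <= window[1])   # restrict the search
    peak_idx = np.argmin(difference[mask])               # most negative sample
    return difference[mask][peak_idx], times[mask][peak_idx]

# Toy demonstration with a synthetic 200-ms negativity (illustrative only).
fs = 500                                      # assumed sampling rate in Hz
times = np.arange(-0.1, 0.5, 1 / fs)          # epoch from -100 to +500 ms
standard = np.zeros_like(times)               # flat "standard" ERP
deviant = -2.0 * np.exp(-((times - 0.2) ** 2) / (2 * 0.02 ** 2))  # Gaussian dip
amplitude, latency = mmn_peak(standard, deviant, times)
print(f"MMN peak: {amplitude:.2f} (arbitrary units) at {latency * 1000:.0f} ms")
```

Under this reading, the latency effect reported in the abstract would appear as an earlier peak time after audio-visual speech training, while the amplitude returned by the same measurement would remain essentially unchanged.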
format | Online Article Text |
id | pubmed-7019039 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-7019039 2020-02-28 The Effect of Visual Articulatory Information on the Neural Correlates of Non-native Speech Sound Discrimination. Plumridge, James M. A.; Barham, Michael P.; Foley, Denise L.; Ware, Anna T.; Clark, Gillian M.; Albein-Urios, Natalia; Hayden, Melissa J.; Lum, Jarrad A. G. Front Hum Neurosci, Human Neuroscience (abstract as in the description field above). Frontiers Media S.A. 2020-02-07 /pmc/articles/PMC7019039/ /pubmed/32116609 http://dx.doi.org/10.3389/fnhum.2020.00025 Text en Copyright © 2020 Plumridge, Barham, Foley, Ware, Clark, Albein-Urios, Hayden and Lum. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
title | The Effect of Visual Articulatory Information on the Neural Correlates of Non-native Speech Sound Discrimination |
title_short | The Effect of Visual Articulatory Information on the Neural Correlates of Non-native Speech Sound Discrimination |
title_sort | effect of visual articulatory information on the neural correlates of non-native speech sound discrimination |
topic | Human Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7019039/ https://www.ncbi.nlm.nih.gov/pubmed/32116609 http://dx.doi.org/10.3389/fnhum.2020.00025 |