Visual Feedback of Tongue Movement for Novel Speech Sound Learning

Bibliographic Details
Main Authors: Katz, William F., Mehta, Sonya
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2015
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4652268/
https://www.ncbi.nlm.nih.gov/pubmed/26635571
http://dx.doi.org/10.3389/fnhum.2015.00612
_version_ 1782401719530946560
author Katz, William F.
Mehta, Sonya
author_facet Katz, William F.
Mehta, Sonya
author_sort Katz, William F.
collection PubMed
description Pronunciation training studies have yielded important information concerning the processing of audiovisual (AV) information. Second language (L2) learners show increased reliance on bottom-up, multimodal input for speech perception (compared to monolingual individuals). However, little is known about the role of viewing one's own speech articulation processes during speech training. The current study investigated whether real-time, visual feedback for tongue movement can improve a speaker's learning of non-native speech sounds. An interactive 3D tongue visualization system based on electromagnetic articulography (EMA) was used in a speech training experiment. Native speakers of American English produced a novel speech sound (/ɖ/; a voiced, coronal, palatal stop) before, during, and after trials in which they viewed their own speech movements using the 3D model. Talkers' productions were evaluated using kinematic (tongue-tip spatial positioning) and acoustic (burst spectra) measures. The results indicated a rapid gain in accuracy associated with visual feedback training. The findings are discussed with respect to neural models for multimodal speech processing.
format Online
Article
Text
id pubmed-4652268
institution National Center for Biotechnology Information
language English
publishDate 2015
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-4652268 2015-12-03 Visual Feedback of Tongue Movement for Novel Speech Sound Learning Katz, William F. Mehta, Sonya Front Hum Neurosci Neuroscience Pronunciation training studies have yielded important information concerning the processing of audiovisual (AV) information. Second language (L2) learners show increased reliance on bottom-up, multimodal input for speech perception (compared to monolingual individuals). However, little is known about the role of viewing one's own speech articulation processes during speech training. The current study investigated whether real-time, visual feedback for tongue movement can improve a speaker's learning of non-native speech sounds. An interactive 3D tongue visualization system based on electromagnetic articulography (EMA) was used in a speech training experiment. Native speakers of American English produced a novel speech sound (/ɖ/; a voiced, coronal, palatal stop) before, during, and after trials in which they viewed their own speech movements using the 3D model. Talkers' productions were evaluated using kinematic (tongue-tip spatial positioning) and acoustic (burst spectra) measures. The results indicated a rapid gain in accuracy associated with visual feedback training. The findings are discussed with respect to neural models for multimodal speech processing. Frontiers Media S.A. 2015-11-19 /pmc/articles/PMC4652268/ /pubmed/26635571 http://dx.doi.org/10.3389/fnhum.2015.00612 Text en Copyright © 2015 Katz and Mehta. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Katz, William F.
Mehta, Sonya
Visual Feedback of Tongue Movement for Novel Speech Sound Learning
title Visual Feedback of Tongue Movement for Novel Speech Sound Learning
title_full Visual Feedback of Tongue Movement for Novel Speech Sound Learning
title_fullStr Visual Feedback of Tongue Movement for Novel Speech Sound Learning
title_full_unstemmed Visual Feedback of Tongue Movement for Novel Speech Sound Learning
title_short Visual Feedback of Tongue Movement for Novel Speech Sound Learning
title_sort visual feedback of tongue movement for novel speech sound learning
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4652268/
https://www.ncbi.nlm.nih.gov/pubmed/26635571
http://dx.doi.org/10.3389/fnhum.2015.00612
work_keys_str_mv AT katzwilliamf visualfeedbackoftonguemovementfornovelspeechsoundlearning
AT mehtasonya visualfeedbackoftonguemovementfornovelspeechsoundlearning