
Learning and Long-Term Retention of Large-Scale Artificial Languages

Recovering discrete words from continuous speech is one of the first challenges facing language learners. Infants and adults can make use of the statistical structure of utterances to learn the forms of words from unsegmented input, suggesting that this ability may be useful for bootstrapping language-specific cues to segmentation. It is unknown, however, whether performance shown in small-scale laboratory demonstrations of “statistical learning” can scale up to allow learning of the lexicons of natural languages, which are orders of magnitude larger. Artificial language experiments with adults can be used to test whether the mechanisms of statistical learning are in principle scalable to larger lexicons. We report data from a large-scale learning experiment that demonstrates that adults can learn words from unsegmented input in much larger languages than previously documented and that they retain the words they learn for years. These results suggest that statistical word segmentation could be scalable to the challenges of lexical acquisition in natural language learning.

Bibliographic Details
Main Authors: Frank, Michael C., Tenenbaum, Joshua B., Gibson, Edward
Format: Online Article Text
Language: English
Published: Public Library of Science, 2013
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3534673/
https://www.ncbi.nlm.nih.gov/pubmed/23300975
http://dx.doi.org/10.1371/journal.pone.0052500
Journal: PLoS One (Research Article)
Published online: 2013-01-02
© 2013 Frank et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.