Under What Conditions Can Recursion Be Learned? Effects of Starting Small in Artificial Grammar Learning of Center‐Embedded Structure

It has been suggested that external and/or internal limitations paradoxically may lead to superior learning, that is, the concepts of starting small and less is more (Elman, 1993; Newport, 1990). In this paper, we explore the type of incremental ordering during training that might help learning, and what mechanism explains this facilitation. We report four artificial grammar learning experiments with human participants. In Experiments 1a and 1b we found a beneficial effect of starting small using two types of simple recursive grammars: right‐branching and center‐embedding, with recursive embedded clauses in fixed positions and fixed length. This effect was replicated in Experiment 2 (N = 100). In Experiments 3 and 4, we used a more complex center‐embedded grammar with recursive loops in variable positions, producing strings of variable length. When participants were presented with an incremental ordering of training stimuli, as in natural language, they were better able to generalize their knowledge of simple units to more complex units when the training input “grew” according to structural complexity, compared to when it “grew” according to string length. Overall, the results suggest that starting small confers an advantage for learning complex center‐embedded structures when the input is organized according to structural complexity.

Bibliographic Details
Main Authors: Poletiek, Fenna H., Conway, Christopher M., Ellefson, Michelle R., Lai, Jun, Bocanegra, Bruno R., Christiansen, Morten H.
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc., 2018
Subjects: Regular Articles
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6585836/
https://www.ncbi.nlm.nih.gov/pubmed/30264489
http://dx.doi.org/10.1111/cogs.12685
collection PubMed
id pubmed-6585836
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Cogn Sci (Cognitive Science ‐ A Multidisciplinary Journal)
publishDate 2018-09-27 (online); 2018-11 (issue)
rights © 2018 The Authors. Cognitive Science ‐ A Multidisciplinary Journal published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society. This is an open access article under the terms of the http://creativecommons.org/licenses/by/4.0/ License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.