Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task
In this paper we probe the interaction between sequential and hierarchical learning by investigating implicit learning in a group of school-aged children. We administered a serial reaction time task, in the form of a modified Simon Task in which the stimuli were organised following the rules of two distinct artificial grammars, specifically Lindenmayer systems: the Fibonacci grammar (Fib) and the Skip grammar (a modification of the former).
Main Authors: Vender, Maria; Krivochen, Diego Gabriel; Compostella, Arianna; Phillips, Beth; Delfitto, Denis; Saddy, Douglas
Format: Online Article Text
Language: English
Published: Public Library of Science, 2020
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7224470/
https://www.ncbi.nlm.nih.gov/pubmed/32407332
http://dx.doi.org/10.1371/journal.pone.0232687
_version_ | 1783533905549197312 |
author | Vender, Maria; Krivochen, Diego Gabriel; Compostella, Arianna; Phillips, Beth; Delfitto, Denis; Saddy, Douglas
author_facet | Vender, Maria; Krivochen, Diego Gabriel; Compostella, Arianna; Phillips, Beth; Delfitto, Denis; Saddy, Douglas
author_sort | Vender, Maria |
collection | PubMed |
description | In this paper we probe the interaction between sequential and hierarchical learning by investigating implicit learning in a group of school-aged children. We administered a serial reaction time task, in the form of a modified Simon Task in which the stimuli were organised following the rules of two distinct artificial grammars, specifically Lindenmayer systems: the Fibonacci grammar (Fib) and the Skip grammar (a modification of the former). The choice of grammars is determined by the goal of this study, which is to investigate how sensitivity to structure emerges in the course of exposure to an input whose surface transitional properties (by hypothesis) bootstrap structure. The studies conducted to date have been mainly designed to investigate low-level superficial regularities, learnable in purely statistical terms, whereas hierarchical learning has not yet been effectively investigated. The possibility of directly pinpointing the interplay between sequential and hierarchical learning is instead at the core of our study: we presented children with two grammars, Fib and Skip, which share the same transitional regularities, thus providing identical opportunities for sequential learning, while crucially differing in their hierarchical structure. More particularly, there are specific points in the sequence (k-points), which, despite giving rise to the same transitional regularities in the two grammars, support hierarchical reconstruction in Fib but not in Skip. In our protocol, children were simply asked to perform a traditional Simon Task, and they were completely unaware of the real purposes of the task. Results indicate that sequential learning occurred in both grammars, as shown by the decrease in reaction times throughout the task, while differences were found in the sensitivity to k-points: these, we contend, play a role in hierarchical reconstruction in Fib, whereas they are devoid of structural significance in Skip. More particularly, we found that children were faster at k-points in sequences produced by Fib, thus providing an entirely new kind of evidence for the hypothesis that implicit learning involves an early activation of strategies of hierarchical reconstruction, based on a straightforward interplay with the statistically-based computation of transitional regularities on the sequences of symbols. |
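For orientation only (this sketch is not part of the record or of the article): a Lindenmayer system such as the Fibonacci grammar mentioned in the abstract is generated by rewriting every symbol of a string in parallel. The productions used below, 0 → 1 and 1 → 10, are a commonly cited formulation of the Fib grammar assumed here for illustration; the article itself defines the exact Fib and Skip productions and the k-points, which are not reproduced in this record.

```python
# Minimal illustrative sketch of a Lindenmayer-system generator, assuming the
# commonly cited Fibonacci-grammar productions 0 -> 1 and 1 -> 10.
from collections import Counter

FIB_RULES = {"0": "1", "1": "10"}  # assumed productions, for illustration only


def rewrite(string: str, rules: dict[str, str]) -> str:
    """Apply one parallel rewriting step of an L-system."""
    return "".join(rules[symbol] for symbol in string)


def generate(axiom: str, rules: dict[str, str], steps: int) -> str:
    """Return the string obtained after `steps` rewriting steps from `axiom`."""
    s = axiom
    for _ in range(steps):
        s = rewrite(s, rules)
    return s


def bigram_transitions(s: str) -> dict[tuple[str, str], float]:
    """Relative frequencies of symbol-to-symbol transitions (surface statistics)."""
    pairs = Counter(zip(s, s[1:]))
    total = sum(pairs.values())
    return {pair: count / total for pair, count in pairs.items()}


if __name__ == "__main__":
    sequence = generate("0", FIB_RULES, 10)
    print(len(sequence))                 # string lengths follow the Fibonacci numbers
    print(bigram_transitions(sequence))  # the bigram "00" never occurs under these rules
```

Under these assumed rules, successive strings have Fibonacci lengths and certain transitions (e.g. the bigram 00) never occur; surface regularities of this kind are what the abstract refers to as the transitional properties available to sequential learning, as distinct from the hierarchical structure probed at k-points.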
format | Online Article Text |
id | pubmed-7224470 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-7224470 2020-06-01 Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task Vender, Maria Krivochen, Diego Gabriel Compostella, Arianna Phillips, Beth Delfitto, Denis Saddy, Douglas PLoS One Research Article In this paper we probe the interaction between sequential and hierarchical learning by investigating implicit learning in a group of school-aged children. We administered a serial reaction time task, in the form of a modified Simon Task in which the stimuli were organised following the rules of two distinct artificial grammars, specifically Lindenmayer systems: the Fibonacci grammar (Fib) and the Skip grammar (a modification of the former). The choice of grammars is determined by the goal of this study, which is to investigate how sensitivity to structure emerges in the course of exposure to an input whose surface transitional properties (by hypothesis) bootstrap structure. The studies conducted to date have been mainly designed to investigate low-level superficial regularities, learnable in purely statistical terms, whereas hierarchical learning has not yet been effectively investigated. The possibility of directly pinpointing the interplay between sequential and hierarchical learning is instead at the core of our study: we presented children with two grammars, Fib and Skip, which share the same transitional regularities, thus providing identical opportunities for sequential learning, while crucially differing in their hierarchical structure. More particularly, there are specific points in the sequence (k-points), which, despite giving rise to the same transitional regularities in the two grammars, support hierarchical reconstruction in Fib but not in Skip. In our protocol, children were simply asked to perform a traditional Simon Task, and they were completely unaware of the real purposes of the task. Results indicate that sequential learning occurred in both grammars, as shown by the decrease in reaction times throughout the task, while differences were found in the sensitivity to k-points: these, we contend, play a role in hierarchical reconstruction in Fib, whereas they are devoid of structural significance in Skip. More particularly, we found that children were faster at k-points in sequences produced by Fib, thus providing an entirely new kind of evidence for the hypothesis that implicit learning involves an early activation of strategies of hierarchical reconstruction, based on a straightforward interplay with the statistically-based computation of transitional regularities on the sequences of symbols. Public Library of Science 2020-05-14 /pmc/articles/PMC7224470/ /pubmed/32407332 http://dx.doi.org/10.1371/journal.pone.0232687 Text en © 2020 Vender et al http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Vender, Maria Krivochen, Diego Gabriel Compostella, Arianna Phillips, Beth Delfitto, Denis Saddy, Douglas Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task |
title | Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task |
title_full | Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task |
title_fullStr | Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task |
title_full_unstemmed | Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task |
title_short | Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task |
title_sort | disentangling sequential from hierarchical learning in artificial grammar learning: evidence from a modified simon task |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7224470/ https://www.ncbi.nlm.nih.gov/pubmed/32407332 http://dx.doi.org/10.1371/journal.pone.0232687 |
work_keys_str_mv | AT vendermaria disentanglingsequentialfromhierarchicallearninginartificialgrammarlearningevidencefromamodifiedsimontask AT krivochendiegogabriel disentanglingsequentialfromhierarchicallearninginartificialgrammarlearningevidencefromamodifiedsimontask AT compostellaarianna disentanglingsequentialfromhierarchicallearninginartificialgrammarlearningevidencefromamodifiedsimontask AT phillipsbeth disentanglingsequentialfromhierarchicallearninginartificialgrammarlearningevidencefromamodifiedsimontask AT delfittodenis disentanglingsequentialfromhierarchicallearninginartificialgrammarlearningevidencefromamodifiedsimontask AT saddydouglas disentanglingsequentialfromhierarchicallearninginartificialgrammarlearningevidencefromamodifiedsimontask |