Manipulating attentional load in sequence learning through random number generation


Bibliographic Details

Main Authors: Wierzchoń, Michał; Gaillard, Vinciane; Asanowicz, Dariusz; Cleeremans, Axel
Format: Online Article Text
Language: English
Published: University of Finance and Management in Warsaw, 2012
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3376889/
https://www.ncbi.nlm.nih.gov/pubmed/22723816
http://dx.doi.org/10.2478/v10053-008-0114-0
Collection: PubMed
Description: Implicit learning is often assumed to be an effortless process. However, some artificial grammar learning and sequence learning studies using dual tasks seem to suggest that attention is essential for implicit learning to occur. This discrepancy probably results from the specific type of secondary task that is used. Different secondary tasks may engage attentional resources differently and therefore may bias performance on the primary task in different ways. Here, we used a random number generation (RNG) task, which may allow for a closer monitoring of a participant’s engagement in a secondary task than the popular secondary task in sequence learning studies: tone counting (TC). In the first two experiments, we investigated the interference associated with performing RNG concurrently with a serial reaction time (SRT) task. In a third experiment, we compared the effects of RNG and TC. In all three experiments, we directly evaluated participants’ knowledge of the sequence with a subsequent sequence generation task. Sequence learning was consistently observed in all experiments, but was impaired under dual-task conditions. Most importantly, our data suggest that RNG is more demanding and impairs learning to a greater extent than TC. Nevertheless, we failed to observe effects of the secondary task in subsequent sequence generation. Our studies indicate that RNG is a promising task to explore the involvement of attention in the SRT task.
Record ID: pubmed-3376889
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Adv Cogn Psychol (Research Article)
Published online: 2012-05-21. Copyright © 2012 University of Finance and Management in Warsaw. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.5/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.