Preparatory attention to visual features primarily relies on non-sensory representation
Prior knowledge of behaviorally relevant information promotes preparatory attention before the appearance of stimuli. A key question is how our brain represents the attended information during preparation. A sensory template hypothesis assumes that preparatory signals evoke neural activity patterns that resemble the perception of the attended stimuli, whereas a non-sensory, abstract template hypothesis assumes that preparatory signals reflect an abstraction of the attended stimuli. To test these hypotheses, we used fMRI and multivariate analysis to characterize neural activity patterns when human participants were prepared to attend a feature and then select it from a compound stimulus. In an fMRI experiment using a basic visual feature (motion direction), we observed reliable decoding of the to-be-attended feature from the preparatory activity in both visual and frontoparietal areas. However, while the neural patterns constructed from a single feature in a baseline task generalized to the activity patterns during stimulus selection, they did not generalize to the activity patterns during preparation. Our findings thus suggest that neural signals during attentional preparation are predominantly non-sensory in nature and may reflect an abstraction of the attended feature. Such a representation could provide efficient and stable guidance of attention.
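The core analysis described above is a cross-task generalization test: a pattern classifier trained on activity evoked by a single feature in a baseline task is tested on activity from the stimulus-selection and preparation periods. The sketch below illustrates that logic with simulated data and scikit-learn; the variable names, simulated patterns, and classifier choice are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' code) of cross-task decoding:
# train a classifier on baseline-task voxel patterns for each attended
# motion direction, then test how well it transfers to the
# stimulus-selection and preparatory periods.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 200

# Simulated voxel patterns (trials x voxels) and attended-direction labels
# (0 or 1) for each task phase; real data would come from fMRI ROIs.
# Because these patterns are pure noise, the scores below sit near chance.
X_baseline, y_baseline = rng.normal(size=(n_trials, n_voxels)), rng.integers(0, 2, n_trials)
X_selection, y_selection = rng.normal(size=(n_trials, n_voxels)), rng.integers(0, 2, n_trials)
X_prepare, y_prepare = rng.normal(size=(n_trials, n_voxels)), rng.integers(0, 2, n_trials)

# Train on the single-feature baseline patterns only.
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X_baseline, y_baseline)

# Cross-task generalization: above-chance transfer would indicate that the
# test-phase patterns resemble the baseline (sensory-like) patterns.
print("baseline -> selection accuracy:", clf.score(X_selection, y_selection))
print("baseline -> preparation accuracy:", clf.score(X_prepare, y_prepare))
```

Under a sensory template account, the baseline-trained classifier would transfer to both test phases; the pattern of results reported here corresponds to transfer to the selection period but not to the preparatory period.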
Main Authors: | Gong, Mengyuan; Chen, Yilin; Liu, Taosheng |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9758135/ https://www.ncbi.nlm.nih.gov/pubmed/36526653 http://dx.doi.org/10.1038/s41598-022-26104-2 |
_version_ | 1784851977202565120 |
---|---|
author | Gong, Mengyuan Chen, Yilin Liu, Taosheng |
author_facet | Gong, Mengyuan Chen, Yilin Liu, Taosheng |
author_sort | Gong, Mengyuan |
collection | PubMed |
description | Prior knowledge of behaviorally relevant information promotes preparatory attention before the appearance of stimuli. A key question is how our brain represents the attended information during preparation. A sensory template hypothesis assumes that preparatory signals evoke neural activity patterns that resemble the perception of the attended stimuli, whereas a non-sensory, abstract template hypothesis assumes that preparatory signals reflect an abstraction of the attended stimuli. To test these hypotheses, we used fMRI and multivariate analysis to characterize neural activity patterns when human participants were prepared to attend a feature and then select it from a compound stimulus. In an fMRI experiment using a basic visual feature (motion direction), we observed reliable decoding of the to-be-attended feature from the preparatory activity in both visual and frontoparietal areas. However, while the neural patterns constructed from a single feature in a baseline task generalized to the activity patterns during stimulus selection, they did not generalize to the activity patterns during preparation. Our findings thus suggest that neural signals during attentional preparation are predominantly non-sensory in nature and may reflect an abstraction of the attended feature. Such a representation could provide efficient and stable guidance of attention. |
format | Online Article Text |
id | pubmed-9758135 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-97581352022-12-18 Preparatory attention to visual features primarily relies on non-sensory representation Gong, Mengyuan Chen, Yilin Liu, Taosheng Sci Rep Article Prior knowledge of behaviorally relevant information promotes preparatory attention before the appearance of stimuli. A key question is how our brain represents the attended information during preparation. A sensory template hypothesis assumes that preparatory signals evoke neural activity patterns that resemble the perception of the attended stimuli, whereas a non-sensory, abstract template hypothesis assumes that preparatory signals reflect an abstraction of the attended stimuli. To test these hypotheses, we used fMRI and multivariate analysis to characterize neural activity patterns when human participants were prepared to attend a feature and then select it from a compound stimulus. In an fMRI experiment using a basic visual feature (motion direction), we observed reliable decoding of the to-be-attended feature from the preparatory activity in both visual and frontoparietal areas. However, while the neural patterns constructed from a single feature in a baseline task generalized to the activity patterns during stimulus selection, they did not generalize to the activity patterns during preparation. Our findings thus suggest that neural signals during attentional preparation are predominantly non-sensory in nature and may reflect an abstraction of the attended feature. Such a representation could provide efficient and stable guidance of attention. Nature Publishing Group UK 2022-12-16 /pmc/articles/PMC9758135/ /pubmed/36526653 http://dx.doi.org/10.1038/s41598-022-26104-2 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/. |
spellingShingle | Article Gong, Mengyuan Chen, Yilin Liu, Taosheng Preparatory attention to visual features primarily relies on non-sensory representation |
title | Preparatory attention to visual features primarily relies on non-sensory representation |
title_full | Preparatory attention to visual features primarily relies on non-sensory representation |
title_fullStr | Preparatory attention to visual features primarily relies on non-sensory representation |
title_full_unstemmed | Preparatory attention to visual features primarily relies on non-sensory representation |
title_short | Preparatory attention to visual features primarily relies on non-sensory representation |
title_sort | preparatory attention to visual features primarily relies on non-sensory representation |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9758135/ https://www.ncbi.nlm.nih.gov/pubmed/36526653 http://dx.doi.org/10.1038/s41598-022-26104-2 |
work_keys_str_mv | AT gongmengyuan preparatoryattentiontovisualfeaturesprimarilyreliesonnonsensoryrepresentation AT chenyilin preparatoryattentiontovisualfeaturesprimarilyreliesonnonsensoryrepresentation AT liutaosheng preparatoryattentiontovisualfeaturesprimarilyreliesonnonsensoryrepresentation |