
Active inference under visuo-proprioceptive conflict: Simulation and empirical results

It has been suggested that the brain controls hand movements via internal models that rely on visual and proprioceptive cues about the state of the hand. In active inference formulations of such models, the relative influence of each modality on action and perception is determined by how precise (reliable) it is expected to be. The ‘top-down’ affordance of expected precision to a particular sensory modality is associated with attention. Here, we asked whether increasing attention to (i.e., the precision of) vision or proprioception would enhance performance in a hand-target phase matching task, in which visual and proprioceptive cues about hand posture were incongruent. We show that in a simple simulated agent—based on predictive coding formulations of active inference—increasing the expected precision of vision or proprioception improved task performance (target matching with the seen or felt hand, respectively) under visuo-proprioceptive conflict. Moreover, we show that this formulation captured the behaviour and self-reported attentional allocation of human participants performing the same task in a virtual reality environment. Together, our results show that selective attention can balance the impact of (conflicting) visual and proprioceptive cues on action—rendering attention a key mechanism for a flexible body representation for action.
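
The mechanism described in the abstract, precision-weighted integration of visual and proprioceptive prediction errors, can be made concrete with a toy simulation. The sketch below is our own minimal illustration, not the authors' published model: it assumes a single hidden state (hand posture) that generates a visual cue displaced by a fixed offset, loosely mirroring the paper's virtual reality task, and all names (simulate, pi_v, pi_p) are illustrative.

def simulate(pi_v, pi_p, target=1.0, offset=0.3, steps=600, dt=0.02):
    """Toy active-inference agent under a fixed visuo-proprioceptive offset."""
    x = mu = a = 0.0                     # true hand position, belief, action
    for _ in range(steps):
        v, p = x + offset, x             # seen and felt hand positions
        eps_v = v - mu                   # visual prediction error
        eps_p = p - mu                   # proprioceptive prediction error
        eps_g = target - mu              # goal (prior) prediction error
        # Perception: the belief descends precision-weighted prediction errors.
        mu += dt * (pi_v * eps_v + pi_p * eps_p + eps_g)
        # Action: both cues move with the hand, so action suppresses the
        # precision-weighted sum of their prediction errors.
        a -= dt * (pi_v * eps_v + pi_p * eps_p)
        x += dt * a
    return x, x + offset                 # final felt and seen hand positions

print(simulate(pi_v=8.0, pi_p=1.0))      # attend to vision: seen hand near target
print(simulate(pi_v=1.0, pi_p=8.0))      # attend to proprioception: felt hand near target

At equilibrium the felt hand settles at target - offset * pi_v / (pi_v + pi_p): raising the expected precision of vision brings the seen hand to the target, while raising the expected precision of proprioception brings the felt hand to it, the qualitative pattern the paper reports.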

Bibliographic Details
Main Authors: Limanowski, Jakub, Friston, Karl
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7055248/
https://www.ncbi.nlm.nih.gov/pubmed/32132646
http://dx.doi.org/10.1038/s41598-020-61097-w
_version_ 1783503335330938880
author Limanowski, Jakub
Friston, Karl
author_facet Limanowski, Jakub
Friston, Karl
author_sort Limanowski, Jakub
collection PubMed
description It has been suggested that the brain controls hand movements via internal models that rely on visual and proprioceptive cues about the state of the hand. In active inference formulations of such models, the relative influence of each modality on action and perception is determined by how precise (reliable) it is expected to be. The ‘top-down’ affordance of expected precision to a particular sensory modality is associated with attention. Here, we asked whether increasing attention to (i.e., the precision of) vision or proprioception would enhance performance in a hand-target phase matching task, in which visual and proprioceptive cues about hand posture were incongruent. We show that in a simple simulated agent—based on predictive coding formulations of active inference—increasing the expected precision of vision or proprioception improved task performance (target matching with the seen or felt hand, respectively) under visuo-proprioceptive conflict. Moreover, we show that this formulation captured the behaviour and self-reported attentional allocation of human participants performing the same task in a virtual reality environment. Together, our results show that selective attention can balance the impact of (conflicting) visual and proprioceptive cues on action—rendering attention a key mechanism for a flexible body representation for action.
format Online
Article
Text
id pubmed-7055248
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-7055248 2020-03-12 Active inference under visuo-proprioceptive conflict: Simulation and empirical results Limanowski, Jakub Friston, Karl Sci Rep Article Nature Publishing Group UK 2020-03-04 /pmc/articles/PMC7055248/ /pubmed/32132646 http://dx.doi.org/10.1038/s41598-020-61097-w Text en © The Author(s) 2020. Open Access under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Limanowski, Jakub
Friston, Karl
Active inference under visuo-proprioceptive conflict: Simulation and empirical results
title Active inference under visuo-proprioceptive conflict: Simulation and empirical results
title_full Active inference under visuo-proprioceptive conflict: Simulation and empirical results
title_fullStr Active inference under visuo-proprioceptive conflict: Simulation and empirical results
title_full_unstemmed Active inference under visuo-proprioceptive conflict: Simulation and empirical results
title_short Active inference under visuo-proprioceptive conflict: Simulation and empirical results
title_sort active inference under visuo-proprioceptive conflict: simulation and empirical results
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7055248/
https://www.ncbi.nlm.nih.gov/pubmed/32132646
http://dx.doi.org/10.1038/s41598-020-61097-w
work_keys_str_mv AT limanowskijakub activeinferenceundervisuoproprioceptiveconflictsimulationandempiricalresults
AT fristonkarl activeinferenceundervisuoproprioceptiveconflictsimulationandempiricalresults