
Belief embodiment through eye movements facilitates memory-guided navigation

Neural network models optimized for task performance often excel at predicting neural activity but do not explain other properties, such as the distributed representation across functionally distinct areas. Distributed representations may arise from animals’ strategies for resource utilization; however, fixation-based paradigms deprive animals of a vital resource: eye movements. During a naturalistic task in which humans use a joystick to steer and catch flashing fireflies in a virtual environment lacking position cues, subjects physically track the latent task variable with their gaze. We show that this strategy also holds during an inertial version of the task in the absence of optic flow, and demonstrate that these task-relevant eye movements reflect an embodiment of the subjects’ dynamically evolving internal beliefs about the goal. A neural network model with tuned recurrent connectivity between oculomotor and evidence-integrating frontoparietal circuits accounted for this behavioral strategy. Critically, this model better explained neural data from monkeys’ posterior parietal cortex than task-optimized models unconstrained by such an oculomotor-based cognitive strategy. These results highlight the importance of unconstrained movement in working memory computations and establish a functional significance of oculomotor signals for evidence-integration and navigation computations via embodied cognition.

Bibliographic Details
Main Authors: Stavropoulos, Akis, Lakshminarasimhan, Kaushik J., Angelaki, Dora E.
Format: Online Article Text
Language: English
Published: Cold Spring Harbor Laboratory, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10473632/
https://www.ncbi.nlm.nih.gov/pubmed/37662309
http://dx.doi.org/10.1101/2023.08.21.554107
License: Creative Commons Attribution-NonCommercial 4.0 International (https://creativecommons.org/licenses/by-nc/4.0/)