Self-motion perception and sequential decision-making: where are we heading?


Bibliographic Details
Main Authors: Jerjian, Steven J.; Harsch, Devin R.; Fetsch, Christopher R.
Format: Online Article (Text)
Language: English
Published: The Royal Society, 2023
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10404932/
https://www.ncbi.nlm.nih.gov/pubmed/37545301
http://dx.doi.org/10.1098/rstb.2022.0333
Collection: PubMed
Description: To navigate and guide adaptive behaviour in a dynamic environment, animals must accurately estimate their own motion relative to the external world. This is a fundamentally multisensory process involving integration of visual, vestibular and kinesthetic inputs. Ideal observer models, paired with careful neurophysiological investigation, helped to reveal how visual and vestibular signals are combined to support perception of linear self-motion direction, or heading. Recent work has extended these findings by emphasizing the dimension of time, both with regard to stimulus dynamics and the trade-off between speed and accuracy. Both time and certainty—i.e. the degree of confidence in a multisensory decision—are essential to the ecological goals of the system: terminating a decision process is necessary for timely action, and predicting one's accuracy is critical for making multiple decisions in a sequence, as in navigation. Here, we summarize a leading model for multisensory decision-making, then show how the model can be extended to study confidence in heading discrimination. Lastly, we preview ongoing efforts to bridge self-motion perception and navigation per se, including closed-loop virtual reality and active self-motion. The design of unconstrained, ethologically inspired tasks, accompanied by large-scale neural recordings, holds promise for a deeper understanding of spatial perception and decision-making in the behaving animal. This article is part of the theme issue ‘Decision and control processes in multisensory perception’.
ID: pubmed-10404932
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Philos Trans R Soc Lond B Biol Sci
Published online: 2023-08-07 (issue date: 2023-09-25)
License: © 2023 The Authors. Published by the Royal Society under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, provided the original author and source are credited.
Topic: Articles