Humans use Optokinetic Eye Movements to Track Waypoints for Steering
It is well-established how visual stimuli and self-motion in laboratory conditions reliably elicit retinal-image-stabilizing compensatory eye movements (CEM). Their organization and roles in natural-task gaze strategies are much less understood: are CEM applied in active sampling of visual information in human locomotion in the wild? If so, how? And what are the implications for guidance? Here, we directly compare gaze behavior in the real world (driving a car) and a fixed-base simulation steering task. A strong and quantifiable correspondence between self-rotation and CEM counter-rotation is found across a range of speeds. This gaze behavior is “optokinetic”, i.e. optic flow is a sufficient stimulus to spontaneously elicit it in naïve subjects, and vestibular stimulation or stereopsis are not critical. Theoretically, the observed nystagmus behavior is consistent with tracking waypoints on the future path, and predicted by waypoint models of locomotor control - but inconsistent with travel point models, such as the popular tangent point model.
Main Authors: Lappi, Otto; Pekkanen, Jami; Rinkkala, Paavo; Tuhkanen, Samuel; Tuononen, Ari; Virtanen, Juho-Pekka
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7060325/ https://www.ncbi.nlm.nih.gov/pubmed/32144287 http://dx.doi.org/10.1038/s41598-020-60531-3
_version_ | 1783504210025775104 |
author | Lappi, Otto; Pekkanen, Jami; Rinkkala, Paavo; Tuhkanen, Samuel; Tuononen, Ari; Virtanen, Juho-Pekka
author_facet | Lappi, Otto; Pekkanen, Jami; Rinkkala, Paavo; Tuhkanen, Samuel; Tuononen, Ari; Virtanen, Juho-Pekka
author_sort | Lappi, Otto |
collection | PubMed |
description | It is well-established how visual stimuli and self-motion in laboratory conditions reliably elicit retinal-image-stabilizing compensatory eye movements (CEM). Their organization and roles in natural-task gaze strategies are much less understood: are CEM applied in active sampling of visual information in human locomotion in the wild? If so, how? And what are the implications for guidance? Here, we directly compare gaze behavior in the real world (driving a car) and a fixed-base simulation steering task. A strong and quantifiable correspondence between self-rotation and CEM counter-rotation is found across a range of speeds. This gaze behavior is “optokinetic”, i.e. optic flow is a sufficient stimulus to spontaneously elicit it in naïve subjects, and vestibular stimulation or stereopsis are not critical. Theoretically, the observed nystagmus behavior is consistent with tracking waypoints on the future path, and predicted by waypoint models of locomotor control - but inconsistent with travel point models, such as the popular tangent point model. |
format | Online Article Text |
id | pubmed-7060325 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-70603252020-03-18 Humans use Optokinetic Eye Movements to Track Waypoints for Steering Lappi, Otto Pekkanen, Jami Rinkkala, Paavo Tuhkanen, Samuel Tuononen, Ari Virtanen, Juho-Pekka Sci Rep Article It is well-established how visual stimuli and self-motion in laboratory conditions reliably elicit retinal-image-stabilizing compensatory eye movements (CEM). Their organization and roles in natural-task gaze strategies are much less understood: are CEM applied in active sampling of visual information in human locomotion in the wild? If so, how? And what are the implications for guidance? Here, we directly compare gaze behavior in the real world (driving a car) and a fixed-base simulation steering task. A strong and quantifiable correspondence between self-rotation and CEM counter-rotation is found across a range of speeds. This gaze behavior is “optokinetic”, i.e. optic flow is a sufficient stimulus to spontaneously elicit it in naïve subjects, and vestibular stimulation or stereopsis are not critical. Theoretically, the observed nystagmus behavior is consistent with tracking waypoints on the future path, and predicted by waypoint models of locomotor control - but inconsistent with travel point models, such as the popular tangent point model. Nature Publishing Group UK 2020-03-06 /pmc/articles/PMC7060325/ /pubmed/32144287 http://dx.doi.org/10.1038/s41598-020-60531-3 Text en © The Author(s) 2020 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. |
spellingShingle | Article Lappi, Otto Pekkanen, Jami Rinkkala, Paavo Tuhkanen, Samuel Tuononen, Ari Virtanen, Juho-Pekka Humans use Optokinetic Eye Movements to Track Waypoints for Steering |
title | Humans use Optokinetic Eye Movements to Track Waypoints for Steering |
title_full | Humans use Optokinetic Eye Movements to Track Waypoints for Steering |
title_fullStr | Humans use Optokinetic Eye Movements to Track Waypoints for Steering |
title_full_unstemmed | Humans use Optokinetic Eye Movements to Track Waypoints for Steering |
title_short | Humans use Optokinetic Eye Movements to Track Waypoints for Steering |
title_sort | humans use optokinetic eye movements to track waypoints for steering |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7060325/ https://www.ncbi.nlm.nih.gov/pubmed/32144287 http://dx.doi.org/10.1038/s41598-020-60531-3 |
work_keys_str_mv | AT lappiotto humansuseoptokineticeyemovementstotrackwaypointsforsteering AT pekkanenjami humansuseoptokineticeyemovementstotrackwaypointsforsteering AT rinkkalapaavo humansuseoptokineticeyemovementstotrackwaypointsforsteering AT tuhkanensamuel humansuseoptokineticeyemovementstotrackwaypointsforsteering AT tuononenari humansuseoptokineticeyemovementstotrackwaypointsforsteering AT virtanenjuhopekka humansuseoptokineticeyemovementstotrackwaypointsforsteering |