Can gaze control steering?

When steering a trajectory, we direct our gaze to locations (1–3 s ahead) that we want to steer through. How and why are these active gaze patterns conducive to successful steering? While various sources of visual information have been identified that could support steering control, the role of stereotypical gaze patterns during steering remains unclear. Here, experimental and computational approaches are combined to investigate a possible direct connection between gaze and steering: Is there enough information in gaze direction that it could be used in isolation to steer through a series of waypoints? For this, we test steering models using waypoints supplied from human gaze data, as well as waypoints specified by optical features of the environment. Steering-by-gaze was modeled using a “pure-pursuit” controller (computing a circular trajectory toward a steering point), or a simple “proportional” controller (yaw-rate set proportional to the visual angle of the steering point). Both controllers produced successful steering when using human gaze data as the input. The models generalized using the same parameters across two scenarios: (a) steering through a slalom of three visible waypoints located within lane boundaries and (b) steering a series of connected S bends comprising visible waypoints without a visible road. While the trajectories on average broadly matched those generated by humans, the differences in individual trajectories were not captured by the models. We suggest that “looking where we are going” provides useful information and that this can often be adequate to guide steering. Capturing variation in human steering responses, however, likely requires more sophisticated models or additional sensory information.
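
Both control laws named in the abstract are standard steering-point controllers. The minimal sketch below (Python) is an illustration only, not the authors' implementation: the gain value, the lookahead inputs, and the unicycle integration step are all assumptions.

import math

def pure_pursuit_yaw_rate(theta, distance, speed):
    # Pure pursuit: steer along the circular arc that passes through the
    # steering point. theta is the point's visual angle relative to the
    # current heading (rad), distance is the straight-line distance to
    # it (m), speed is forward speed (m/s). The arc geometry gives
    # curvature = 2*sin(theta)/distance.
    curvature = 2.0 * math.sin(theta) / distance
    return speed * curvature  # yaw rate, rad/s

def proportional_yaw_rate(theta, gain=1.5):
    # Proportional control: yaw rate set proportional to the steering
    # point's visual angle. The gain here is illustrative, not fitted.
    return gain * theta

def step(x, y, heading, speed, yaw_rate, dt):
    # Advance a simple unicycle vehicle model by one time step.
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

In a steering-by-gaze simulation along these lines, each recorded gaze sample would supply theta (and, for pure pursuit, a distance to the fixated waypoint), the chosen controller returns a yaw rate, and the pose is integrated forward; the resulting trajectory can then be compared with the human one.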

Bibliographic Details
Main Authors: Tuhkanen, Samuel; Pekkanen, Jami; Mole, Callum; Wilkie, Richard M.; Lappi, Otto
Format: Online Article Text
Language: English
Published: The Association for Research in Vision and Ophthalmology, 2023-07-21
Subjects: Article
Journal: J Vis
Collection: PubMed (record pubmed-10365140; MEDLINE/PubMed via the National Center for Biotechnology Information)
Rights: Copyright 2023 The Authors. This work is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10365140/
https://www.ncbi.nlm.nih.gov/pubmed/37477935
http://dx.doi.org/10.1167/jov.23.7.12