
Eye and head movements while looking at rotated scenes in VR. Session "Beyond the screen's edge" at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019

Bibliographic Details
Main Authors: Anderson, Nicola C., Bischof, Walter F.
Format: Online Article Text
Language: English
Published: Bern Open Publishing 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7917486/
https://www.ncbi.nlm.nih.gov/pubmed/33828771
http://dx.doi.org/10.16910/jemr.12.7.11
_version_ 1783657710246428672
author Anderson, Nicola C.
Bischof, Walter F.
author_facet Anderson, Nicola C.
Bischof, Walter F.
author_sort Anderson, Nicola C.
collection PubMed
description We examined the extent to which image shape (square vs. circle), image rotation, and image content (landscapes vs. fractal images) influenced eye and head movements. Both the eyes and head were tracked while observers looked at natural scenes in a virtual reality (VR) environment. In line with previous work, we found a horizontal bias in saccade directions, but this was affected by both the image shape and its content. Interestingly, when viewing landscapes (but not fractals), observers rotated their head in line with the image rotation, presumably to make saccades in cardinal, rather than oblique, directions. We discuss our findings in relation to current theories on eye movement control, and how insights from VR might inform traditional eye-tracking studies. Part 2: Observers looked at panoramic, 360-degree scenes using VR goggles while eye and head movements were tracked. Fixations were determined using IDT (Salvucci & Goldberg, 2000) adapted to a spherical coordinate system. We then analyzed a) the spatial distribution of fixations and the distribution of saccade directions, b) the spatial distribution of head positions and the distribution of head movements, and c) the relation between gaze and head movements. We found that, for landscape scenes, gaze and head best fit the allocentric frame defined by the scene horizon, especially when taking head tilt (i.e., head rotation around the view axis) into account. For fractal scenes, which are isotropic on average, the bias toward a body-centric frame is weak for gaze and strong for the head. Furthermore, our data show that eye and head movements are closely linked in space and time in stereotypical ways, with volitional eye movements predominantly leading the head. We discuss our results in terms of models of visual exploratory behavior in panoramic scenes, both in virtual and real environments. Video stream: https://vimeo.com/356859979 Production and publication of the video stream were sponsored by SCIANS Ltd ( http://www.scians.ch/ ).
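
The description above notes that fixations were identified with the dispersion-threshold algorithm IDT (Salvucci & Goldberg, 2000) adapted to a spherical coordinate system. The authors' own code is not part of this record, so the following is a minimal Python sketch of one plausible such adaptation: gaze samples given as yaw/pitch angles are mapped to unit vectors, and a window's dispersion is measured as the angular spread around its mean gaze direction instead of the planar (max minus min) sum used in screen-based IDT. The function names, the 1-degree dispersion and 100 ms duration thresholds, and the exact dispersion measure are illustrative assumptions, not the authors' implementation.

import numpy as np

def angular_dispersion(yaw_deg, pitch_deg):
    """Angular spread (degrees) of gaze samples around their mean direction.
    Assumed proxy for IDT dispersion on the sphere; the paper may define it differently."""
    y = np.radians(np.asarray(yaw_deg, dtype=float))
    p = np.radians(np.asarray(pitch_deg, dtype=float))
    # Spherical (yaw, pitch) -> unit gaze vectors
    v = np.stack([np.cos(p) * np.cos(y), np.cos(p) * np.sin(y), np.sin(p)], axis=1)
    mean_dir = v.mean(axis=0)
    mean_dir /= np.linalg.norm(mean_dir)
    # Angle of each sample from the mean gaze direction
    dev = np.degrees(np.arccos(np.clip(v @ mean_dir, -1.0, 1.0)))
    return 2.0 * dev.max()

def idt_spherical(yaw_deg, pitch_deg, t, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-threshold (IDT) fixation detection on spherical gaze data.
    yaw_deg, pitch_deg in degrees, t in seconds; returns (start, end) sample index pairs."""
    yaw_deg, pitch_deg, t = map(np.asarray, (yaw_deg, pitch_deg, t))
    fixations, i, n = [], 0, len(t)
    while i < n:
        j = i
        # Grow the window until it spans the minimum fixation duration
        while j < n and t[j] - t[i] < min_duration:
            j += 1
        if j >= n:
            break
        if angular_dispersion(yaw_deg[i:j + 1], pitch_deg[i:j + 1]) <= max_dispersion:
            # Keep extending while the angular dispersion stays below threshold
            while j + 1 < n and angular_dispersion(yaw_deg[i:j + 2], pitch_deg[i:j + 2]) <= max_dispersion:
                j += 1
            fixations.append((i, j))
            i = j + 1
        else:
            # No fixation starting here; slide the window forward by one sample
            i += 1
    return fixations

Samples within the returned index pairs would then be aggregated into fixation locations, and saccade directions could be derived from consecutive fixation centroids, along the lines of the analyses described in the abstract.
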
format Online
Article
Text
id pubmed-7917486
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher Bern Open Publishing
record_format MEDLINE/PubMed
spelling pubmed-7917486 2021-04-06 Eye and head movements while looking at rotated scenes in VR. Session "Beyond the screen's edge" at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019 Anderson, Nicola C. Bischof, Walter F. J Eye Mov Res Research Article We examined the extent to which image shape (square vs. circle), image rotation, and image content (landscapes vs. fractal images) influenced eye and head movements. Both the eyes and head were tracked while observers looked at natural scenes in a virtual reality (VR) environment. In line with previous work, we found a horizontal bias in saccade directions, but this was affected by both the image shape and its content. Interestingly, when viewing landscapes (but not fractals), observers rotated their head in line with the image rotation, presumably to make saccades in cardinal, rather than oblique, directions. We discuss our findings in relation to current theories on eye movement control, and how insights from VR might inform traditional eye-tracking studies. Part 2: Observers looked at panoramic, 360-degree scenes using VR goggles while eye and head movements were tracked. Fixations were determined using IDT (Salvucci & Goldberg, 2000) adapted to a spherical coordinate system. We then analyzed a) the spatial distribution of fixations and the distribution of saccade directions, b) the spatial distribution of head positions and the distribution of head movements, and c) the relation between gaze and head movements. We found that, for landscape scenes, gaze and head best fit the allocentric frame defined by the scene horizon, especially when taking head tilt (i.e., head rotation around the view axis) into account. For fractal scenes, which are isotropic on average, the bias toward a body-centric frame is weak for gaze and strong for the head. Furthermore, our data show that eye and head movements are closely linked in space and time in stereotypical ways, with volitional eye movements predominantly leading the head. We discuss our results in terms of models of visual exploratory behavior in panoramic scenes, both in virtual and real environments. Video stream: https://vimeo.com/356859979 Production and publication of the video stream were sponsored by SCIANS Ltd ( http://www.scians.ch/ ). Bern Open Publishing 2019-11-25 /pmc/articles/PMC7917486/ /pubmed/33828771 http://dx.doi.org/10.16910/jemr.12.7.11 Text en This work is licensed under a Creative Commons Attribution 4.0 International License ( https://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use and redistribution provided that the original author and source are credited.
spellingShingle Research Article
Anderson, Nicola C.
Bischof, Walter F.
Eye and head movements while looking at rotated scenes in VR. Session "Beyond the screen's edge" at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019
title Eye and head movements while looking at rotated scenes in VR. Session "Beyond the screen's edge" at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019
title_full Eye and head movements while looking at rotated scenes in VR. Session "Beyond the screen's edge" at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019
title_fullStr Eye and head movements while looking at rotated scenes in VR. Session "Beyond the screen's edge" at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019
title_full_unstemmed Eye and head movements while looking at rotated scenes in VR. Session "Beyond the screen's edge" at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019
title_short Eye and head movements while looking at rotated scenes in VR. Session "Beyond the screen's edge" at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019
title_sort eye and head movements while looking at rotated scenes in vr. session "beyond the screen's edge" at the 20th european conference on eye movement research (ecem) in alicante, 19.8.2019
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7917486/
https://www.ncbi.nlm.nih.gov/pubmed/33828771
http://dx.doi.org/10.16910/jemr.12.7.11
work_keys_str_mv AT andersonnicolac eyeandheadmovementswhilelookingatrotatedscenesinvrsessionbeyondthescreensedgeatthe20theuropeanconferenceoneyemovementresearcheceminalicante1982019
AT bischofwalterf eyeandheadmovementswhilelookingatrotatedscenesinvrsessionbeyondthescreensedgeatthe20theuropeanconferenceoneyemovementresearcheceminalicante1982019