A Case for Studying Naturalistic Eye and Head Movements in Virtual Environments

Bibliographic Details
Main Authors: Callahan-Flintoft, Chloe; Barentine, Christian; Touryan, Jonathan; Ries, Anthony J.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8759101/
https://www.ncbi.nlm.nih.gov/pubmed/35035362
http://dx.doi.org/10.3389/fpsyg.2021.650693
author Callahan-Flintoft, Chloe
Barentine, Christian
Touryan, Jonathan
Ries, Anthony J.
author_sort Callahan-Flintoft, Chloe
collection PubMed
description Using head mounted displays (HMDs) in conjunction with virtual reality (VR), vision researchers are able to capture more naturalistic vision in an experimentally controlled setting. Namely, eye movements can be accurately tracked as they occur in concert with head movements as subjects navigate virtual environments. A benefit of this approach is that, unlike other mobile eye tracking (ET) set-ups in unconstrained settings, the experimenter has precise control over the location and timing of stimulus presentation, making it easier to compare findings between HMD studies and those that use monitor displays, which account for the bulk of previous work in eye movement research and vision sciences more generally. Here, a visual discrimination paradigm is presented as a proof of concept to demonstrate the applicability of collecting eye and head tracking data from an HMD in VR for vision research. The current work's contribution is threefold: first, results demonstrate both the strengths and the weaknesses of recording and classifying eye and head tracking data in VR; second, a highly flexible graphical user interface (GUI) used to generate the current experiment is offered to lower the software development start-up cost for future researchers transitioning to a VR space; and finally, the dataset analyzed here, comprising behavioral, eye, and head tracking data synchronized with environmental variables from a task specifically designed to elicit a variety of eye and head movements, could be an asset in testing future eye movement classification algorithms.
format Online
Article
Text
id pubmed-8759101
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8759101 2022-01-15 A Case for Studying Naturalistic Eye and Head Movements in Virtual Environments Callahan-Flintoft, Chloe Barentine, Christian Touryan, Jonathan Ries, Anthony J. Front Psychol Psychology Frontiers Media S.A. 2021-12-31 /pmc/articles/PMC8759101/ /pubmed/35035362 http://dx.doi.org/10.3389/fpsyg.2021.650693 Text en Copyright © 2021 Callahan-Flintoft, Barentine, Touryan and Ries.
https://creativecommons.org/licenses/by/4.0/This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title A Case for Studying Naturalistic Eye and Head Movements in Virtual Environments
title_sort case for studying naturalistic eye and head movements in virtual environments
topic Psychology