
A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering

This work presents the analysis of data recorded by an eye tracking device in the course of evaluating a foveated rendering approach for head-mounted displays (HMDs). Foveated rendering methods adapt the image synthesis process to the user’s gaze and exploit the human visual system’s limitations...


Bibliographic Details
Main Authors: Roth, Thorsten, Weier, Martin, Hinkenjann, André, Li, Yongmin, Slusallek, Philipp
Format: Online Article Text
Language: English
Published: Bern Open Publishing 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7141096/
https://www.ncbi.nlm.nih.gov/pubmed/33828673
http://dx.doi.org/10.16910/jemr.10.5.2
_version_ 1783519122534957056
author Roth, Thorsten
Weier, Martin
Hinkenjann, André
Li, Yongmin
Slusallek, Philipp
author_facet Roth, Thorsten
Weier, Martin
Hinkenjann, André
Li, Yongmin
Slusallek, Philipp
author_sort Roth, Thorsten
collection PubMed
description This work presents the analysis of data recorded by an eye tracking device in the course of evaluating a foveated rendering approach for head-mounted displays (HMDs). Foveated rendering methods adapt the image synthesis process to the user’s gaze and exploit the human visual system’s limitations to increase rendering performance. In particular, foveated rendering has great potential when certain requirements have to be fulfilled, such as low-latency rendering to cope with high display refresh rates. This is crucial for virtual reality (VR), where a high level of immersion, which can only be achieved with high rendering performance and which also helps to reduce nausea, is an important factor. We put things in context by first providing basic information about our rendering system, followed by a description of the user study and the collected data. These data stem from fixation tasks that subjects had to perform while being shown fly-through sequences of virtual scenes on an HMD. The fixation tasks combined various scenes with different fixation modes: besides static fixation targets, moving targets on randomized paths as well as a free focus mode were tested. Using these data, we estimate the precision of the utilized eye tracker and analyze the participants’ accuracy in focusing on the displayed fixation targets. We also examine eccentricity-dependent quality ratings. Comparing this information with the quality ratings users gave for the displayed sequences reveals an interesting connection between fixation modes, fixation accuracy, and quality ratings.
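The description mentions estimating the eye tracker's precision and the participants' fixation accuracy. As an illustration only (not the authors' actual analysis code): angular precision is commonly computed as the RMS of angular distances between successive gaze samples during a fixation, and accuracy as the mean angular offset between gaze samples and the target direction. A minimal Python sketch, assuming gaze samples and targets are given as 3D direction vectors:

```python
import math

def angular_distance(a, b):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def precision_rms(samples):
    """Precision: RMS of angular distances between successive gaze samples."""
    dists = [angular_distance(samples[i], samples[i + 1])
             for i in range(len(samples) - 1)]
    return math.sqrt(sum(d * d for d in dists) / len(dists))

def accuracy(samples, target):
    """Accuracy: mean angular offset between gaze samples and the target."""
    return sum(angular_distance(s, target) for s in samples) / len(samples)
```

Lower values are better for both metrics; how the paper defines its eccentricity-dependent analysis beyond this is given in the full text.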
format Online
Article
Text
id pubmed-7141096
institution National Center for Biotechnology Information
language English
publishDate 2017
publisher Bern Open Publishing
record_format MEDLINE/PubMed
spelling pubmed-71410962021-04-06 A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering Roth, Thorsten Weier, Martin Hinkenjann, André Li, Yongmin Slusallek, Philipp J Eye Mov Res Research Article This work presents the analysis of data recorded by an eye tracking device in the course of evaluating a foveated rendering approach for head-mounted displays (HMDs). Foveated rendering methods adapt the image synthesis process to the user’s gaze and exploit the human visual system’s limitations to increase rendering performance. In particular, foveated rendering has great potential when certain requirements have to be fulfilled, such as low-latency rendering to cope with high display refresh rates. This is crucial for virtual reality (VR), where a high level of immersion, which can only be achieved with high rendering performance and which also helps to reduce nausea, is an important factor. We put things in context by first providing basic information about our rendering system, followed by a description of the user study and the collected data. These data stem from fixation tasks that subjects had to perform while being shown fly-through sequences of virtual scenes on an HMD. The fixation tasks combined various scenes with different fixation modes: besides static fixation targets, moving targets on randomized paths as well as a free focus mode were tested. Using these data, we estimate the precision of the utilized eye tracker and analyze the participants’ accuracy in focusing on the displayed fixation targets. We also examine eccentricity-dependent quality ratings. Comparing this information with the quality ratings users gave for the displayed sequences reveals an interesting connection between fixation modes, fixation accuracy, and quality ratings.
Bern Open Publishing 2017-09-28 /pmc/articles/PMC7141096/ /pubmed/33828673 http://dx.doi.org/10.16910/jemr.10.5.2 Text en This work is licensed under a Creative Commons Attribution 4.0 International License, ( https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use and redistribution provided that the original author and source are credited.
spellingShingle Research Article
Roth, Thorsten
Weier, Martin
Hinkenjann, André
Li, Yongmin
Slusallek, Philipp
A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_full A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_fullStr A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_full_unstemmed A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_short A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_sort quality-centered analysis of eye tracking data in foveated rendering
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7141096/
https://www.ncbi.nlm.nih.gov/pubmed/33828673
http://dx.doi.org/10.16910/jemr.10.5.2
work_keys_str_mv AT roththorsten aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT weiermartin aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT hinkenjannandre aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT liyongmin aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT slusallekphilipp aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT roththorsten qualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT weiermartin qualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT hinkenjannandre qualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT liyongmin qualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT slusallekphilipp qualitycenteredanalysisofeyetrackingdatainfoveatedrendering