Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking
Real-world scene perception is typically studied in the laboratory using static picture viewing with restrained head position. Consequently, the transfer of results obtained in this paradigm to real-world scenarios has been questioned. The advancement of mobile eye-trackers and the progress in image...
Main Authors: Backhaus, Daniel; Engbert, Ralf; Rothkegel, Lars O. M.; Trukenbrod, Hans A.
Format: Online Article Text
Language: English
Published: The Association for Research in Vision and Ophthalmology, 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7409614/ https://www.ncbi.nlm.nih.gov/pubmed/32392286 http://dx.doi.org/10.1167/jov.20.5.3
_version_ | 1783568094148427776 |
author | Backhaus, Daniel; Engbert, Ralf; Rothkegel, Lars O. M.; Trukenbrod, Hans A. |
author_facet | Backhaus, Daniel; Engbert, Ralf; Rothkegel, Lars O. M.; Trukenbrod, Hans A. |
author_sort | Backhaus, Daniel |
collection | PubMed |
description | Real-world scene perception is typically studied in the laboratory using static picture viewing with restrained head position. Consequently, the transfer of results obtained in this paradigm to real-world scenarios has been questioned. The advancement of mobile eye-trackers and the progress in image processing, however, permit a more natural experimental setup that, at the same time, maintains the high experimental control from the standard laboratory setting. We investigated eye movements while participants were standing in front of a projector screen and explored images under four specific task instructions. Eye movements were recorded with a mobile eye-tracking device and raw gaze data were transformed from head-centered into image-centered coordinates. We observed differences between tasks in temporal and spatial eye-movement parameters and found that the bias to fixate images near the center differed between tasks. Our results demonstrate that current mobile eye-tracking technology and a highly controlled design support the study of fine-scaled task dependencies in an experimental setting that permits more natural viewing behavior than the static picture viewing paradigm. |
format | Online Article Text |
id | pubmed-7409614 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | The Association for Research in Vision and Ophthalmology |
record_format | MEDLINE/PubMed |
spelling | pubmed-7409614 2020-08-19 Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking Backhaus, Daniel; Engbert, Ralf; Rothkegel, Lars O. M.; Trukenbrod, Hans A. J Vis Article The Association for Research in Vision and Ophthalmology 2020-05-11 /pmc/articles/PMC7409614/ /pubmed/32392286 http://dx.doi.org/10.1167/jov.20.5.3 Text en Copyright 2020 The Authors http://creativecommons.org/licenses/by/4.0/ This work is licensed under a Creative Commons Attribution 4.0 International License. |
spellingShingle | Article Backhaus, Daniel; Engbert, Ralf; Rothkegel, Lars O. M.; Trukenbrod, Hans A. Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking |
title | Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking |
title_full | Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking |
title_fullStr | Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking |
title_full_unstemmed | Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking |
title_short | Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking |
title_sort | task-dependence in scene perception: head unrestrained viewing using mobile eye-tracking |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7409614/ https://www.ncbi.nlm.nih.gov/pubmed/32392286 http://dx.doi.org/10.1167/jov.20.5.3 |
work_keys_str_mv | AT backhausdaniel taskdependenceinsceneperceptionheadunrestrainedviewingusingmobileeyetracking AT engbertralf taskdependenceinsceneperceptionheadunrestrainedviewingusingmobileeyetracking AT rothkegellarsom taskdependenceinsceneperceptionheadunrestrainedviewingusingmobileeyetracking AT trukenbrodhansa taskdependenceinsceneperceptionheadunrestrainedviewingusingmobileeyetracking |
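The abstract mentions that raw gaze data were transformed from head-centered into image-centered coordinates. The sketch below illustrates one common way such a mapping can be done, using a planar homography from the mobile tracker's scene-camera frame to the presented image. It is not the authors' published pipeline: the function name, the corner detections, and all numeric values are hypothetical assumptions for illustration only.

```python
# Illustrative sketch only (assumed approach, not the published pipeline):
# map head-centered gaze samples, given in scene-camera pixel coordinates,
# into image-centered coordinates on the projector screen via a homography.
import numpy as np
import cv2

def gaze_to_image_coords(gaze_xy, screen_corners_cam, image_size):
    """gaze_xy: (N, 2) gaze points in scene-camera pixels.
    screen_corners_cam: (4, 2) corners of the projected image as detected in the
        scene-camera frame, ordered top-left, top-right, bottom-right, bottom-left
        (hypothetical input, e.g. from marker or screen detection).
    image_size: (width, height) of the presented image in pixels.
    Returns (N, 2) gaze points in image-centered pixel coordinates."""
    w, h = image_size
    corners_img = np.array([[0, 0], [w, 0], [w, h], [0, h]], dtype=np.float32)
    # Homography from scene-camera plane to image plane.
    H, _ = cv2.findHomography(np.asarray(screen_corners_cam, np.float32), corners_img)
    pts = np.asarray(gaze_xy, np.float32).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

# Hypothetical example: one gaze sample mapped onto a 1200 x 960 px image.
corners = [(212, 145), (1078, 160), (1065, 803), (198, 790)]
print(gaze_to_image_coords([(640, 480)], corners, image_size=(1200, 960)))
```

Re-estimating such a mapping for every scene-camera frame is what would compensate for head movement in this kind of setup, since the screen corners are re-detected as the head moves relative to the projector screen.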