Combining cues to judge distance and direction in an immersive virtual reality environment
Main Authors: | Scarfe, Peter; Glennerster, Andrew |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | The Association for Research in Vision and Ophthalmology, 2021 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8083085/ https://www.ncbi.nlm.nih.gov/pubmed/33900366 http://dx.doi.org/10.1167/jov.21.4.10 |
_version_ | 1783685962662936576 |
---|---|
author | Scarfe, Peter Glennerster, Andrew |
author_facet | Scarfe, Peter Glennerster, Andrew |
author_sort | Scarfe, Peter |
collection | PubMed |
description | When we move, the visual direction of objects in the environment can change substantially. Compared with our understanding of depth perception, the problem the visual system faces in computing this change is relatively poorly understood. Here, we tested the extent to which participants’ judgments of visual direction could be predicted by standard cue combination rules. Participants were tested in virtual reality using a head-mounted display. In a simulated room, they judged the position of an object at one location, before walking to another location in the room and judging, in a second interval, whether an object was at the expected visual direction of the first. By manipulating the scale of the room across intervals, which was subjectively invisible to observers, we put two classes of cue into conflict, one that depends only on visual information and one that uses proprioceptive information to scale any reconstruction of the scene. We find that the sensitivity to changes in one class of cue while keeping the other constant provides a good prediction of performance when both cues vary, consistent with the standard cue combination framework. Nevertheless, by comparing judgments of visual direction with those of distance, we show that judgments of visual direction and distance are mutually inconsistent. We discuss why there is no need for any contradiction between these two conclusions. |
format | Online Article Text |
id | pubmed-8083085 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | The Association for Research in Vision and Ophthalmology |
record_format | MEDLINE/PubMed |
spelling | pubmed-8083085 2021-05-05 Combining cues to judge distance and direction in an immersive virtual reality environment Scarfe, Peter Glennerster, Andrew J Vis Article When we move, the visual direction of objects in the environment can change substantially. Compared with our understanding of depth perception, the problem the visual system faces in computing this change is relatively poorly understood. Here, we tested the extent to which participants’ judgments of visual direction could be predicted by standard cue combination rules. Participants were tested in virtual reality using a head-mounted display. In a simulated room, they judged the position of an object at one location, before walking to another location in the room and judging, in a second interval, whether an object was at the expected visual direction of the first. By manipulating the scale of the room across intervals, which was subjectively invisible to observers, we put two classes of cue into conflict, one that depends only on visual information and one that uses proprioceptive information to scale any reconstruction of the scene. We find that the sensitivity to changes in one class of cue while keeping the other constant provides a good prediction of performance when both cues vary, consistent with the standard cue combination framework. Nevertheless, by comparing judgments of visual direction with those of distance, we show that judgments of visual direction and distance are mutually inconsistent. We discuss why there is no need for any contradiction between these two conclusions. The Association for Research in Vision and Ophthalmology 2021-04-26 /pmc/articles/PMC8083085/ /pubmed/33900366 http://dx.doi.org/10.1167/jov.21.4.10 Text en Copyright 2021 The Authors https://creativecommons.org/licenses/by/4.0/ This work is licensed under a Creative Commons Attribution 4.0 International License. |
spellingShingle | Article Scarfe, Peter Glennerster, Andrew Combining cues to judge distance and direction in an immersive virtual reality environment |
title | Combining cues to judge distance and direction in an immersive virtual reality environment |
title_full | Combining cues to judge distance and direction in an immersive virtual reality environment |
title_fullStr | Combining cues to judge distance and direction in an immersive virtual reality environment |
title_full_unstemmed | Combining cues to judge distance and direction in an immersive virtual reality environment |
title_short | Combining cues to judge distance and direction in an immersive virtual reality environment |
title_sort | combining cues to judge distance and direction in an immersive virtual reality environment |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8083085/ https://www.ncbi.nlm.nih.gov/pubmed/33900366 http://dx.doi.org/10.1167/jov.21.4.10 |
work_keys_str_mv | AT scarfepeter combiningcuestojudgedistanceanddirectioninanimmersivevirtualrealityenvironment AT glennersterandrew combiningcuestojudgedistanceanddirectioninanimmersivevirtualrealityenvironment |
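The abstract refers to "standard cue combination rules." For readers unfamiliar with that framework, the sketch below gives the textbook maximum-likelihood (reliability-weighted) combination rule; the notation (the estimates Ŝ_v, Ŝ_p, their noise standard deviations σ_v, σ_p, and the weights w_v, w_p) is introduced here for illustration and is not taken from the paper, which may fit a more specific model.

```latex
% Textbook maximum-likelihood cue combination (general illustration only,
% not the specific fits reported in the paper).
% \hat{S}_v, \hat{S}_p : estimates from the visual-only and the
% proprioceptively scaled cue; \sigma_v, \sigma_p : their noise SDs.
\begin{align}
  \hat{S}_c &= w_v \hat{S}_v + w_p \hat{S}_p,
  \qquad
  w_v = \frac{1/\sigma_v^{2}}{1/\sigma_v^{2} + 1/\sigma_p^{2}},
  \qquad
  w_p = 1 - w_v, \\
  \sigma_c^{2} &= \frac{\sigma_v^{2}\,\sigma_p^{2}}{\sigma_v^{2} + \sigma_p^{2}}
  \;\le\; \min\!\left(\sigma_v^{2},\, \sigma_p^{2}\right).
\end{align}
```

Under this rule, the single-cue sensitivities σ_v and σ_p, measured while only one class of cue varies, fully determine the predicted combined sensitivity σ_c, which is the sense in which single-cue performance can "provide a good prediction of performance when both cues vary."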