Investigating distortions in perceptual stability during different self-movements using virtual reality
Using immersive virtual reality (the HTC Vive Head Mounted Display), we measured both bias and sensitivity when making judgements about the scene stability of a target object during both active (self-propelled) and passive (experimenter-propelled) observer movements. This was repeated in the same group of 16 participants for three different observer-target movement conditions in which the instability of a target was yoked to the movement of the observer. We found that in all movement conditions the target needed to move with (in the same direction as) the participant to be perceived as scene-stable. Consistent with the presence of additional available information (efference copy) about self-movement during active conditions, biases were smaller and sensitivities to instability were higher in active relative to passive conditions. However, the presence of efference copy was clearly not sufficient to completely eliminate the bias, and we suggest that the presence of additional visual information about self-movement is also critical. We found some (albeit limited) evidence for correlation between appropriate metrics across different movement conditions. These results extend previous findings, providing evidence for consistency of biases across different movement types, suggestive of common processing underpinning perceptual stability judgements.
Main Authors: Warren, Paul A.; Bell, Graham; Li, Yu
Format: Online Article Text
Language: English
Published: SAGE Publications, 2022
Subjects: Articles
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9478599/ https://www.ncbi.nlm.nih.gov/pubmed/35946126 http://dx.doi.org/10.1177/03010066221116480
_version_ | 1784790607899656192 |
author | Warren, Paul A. Bell, Graham Li, Yu |
author_facet | Warren, Paul A. Bell, Graham Li, Yu |
author_sort | Warren, Paul A. |
collection | PubMed |
description | Using immersive virtual reality (the HTC Vive Head Mounted Display), we measured both bias and sensitivity when making judgements about the scene stability of a target object during both active (self-propelled) and passive (experimenter-propelled) observer movements. This was repeated in the same group of 16 participants for three different observer-target movement conditions in which the instability of a target was yoked to the movement of the observer. We found that in all movement conditions the target needed to move with (in the same direction as) the participant to be perceived as scene-stable. Consistent with the presence of additional available information (efference copy) about self-movement during active conditions, biases were smaller and sensitivities to instability were higher in active relative to passive conditions. However, the presence of efference copy was clearly not sufficient to completely eliminate the bias, and we suggest that the presence of additional visual information about self-movement is also critical. We found some (albeit limited) evidence for correlation between appropriate metrics across different movement conditions. These results extend previous findings, providing evidence for consistency of biases across different movement types, suggestive of common processing underpinning perceptual stability judgements. |
format | Online Article Text |
id | pubmed-9478599 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | SAGE Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-9478599 2022-09-17 Investigating distortions in perceptual stability during different self-movements using virtual reality Warren, Paul A. Bell, Graham Li, Yu Perception Articles Using immersive virtual reality (the HTC Vive Head Mounted Display), we measured both bias and sensitivity when making judgements about the scene stability of a target object during both active (self-propelled) and passive (experimenter-propelled) observer movements. This was repeated in the same group of 16 participants for three different observer-target movement conditions in which the instability of a target was yoked to the movement of the observer. We found that in all movement conditions the target needed to move with (in the same direction as) the participant to be perceived as scene-stable. Consistent with the presence of additional available information (efference copy) about self-movement during active conditions, biases were smaller and sensitivities to instability were higher in active relative to passive conditions. However, the presence of efference copy was clearly not sufficient to completely eliminate the bias, and we suggest that the presence of additional visual information about self-movement is also critical. We found some (albeit limited) evidence for correlation between appropriate metrics across different movement conditions. These results extend previous findings, providing evidence for consistency of biases across different movement types, suggestive of common processing underpinning perceptual stability judgements. SAGE Publications 2022-08-09 2022-10 /pmc/articles/PMC9478599/ /pubmed/35946126 http://dx.doi.org/10.1177/03010066221116480 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ This article is distributed under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page (https://us.sagepub.com/en-us/nam/open-access-at-sage). |
spellingShingle | Articles Warren, Paul A. Bell, Graham Li, Yu Investigating distortions in perceptual stability during different self-movements using virtual reality |
title | Investigating distortions in perceptual stability during different self-movements using virtual reality |
title_full | Investigating distortions in perceptual stability during different self-movements using virtual reality |
title_fullStr | Investigating distortions in perceptual stability during different self-movements using virtual reality |
title_full_unstemmed | Investigating distortions in perceptual stability during different self-movements using virtual reality |
title_short | Investigating distortions in perceptual stability during different self-movements using virtual reality |
title_sort | investigating distortions in perceptual stability during different self-movements using virtual reality |
topic | Articles |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9478599/ https://www.ncbi.nlm.nih.gov/pubmed/35946126 http://dx.doi.org/10.1177/03010066221116480 |
work_keys_str_mv | AT warrenpaula investigatingdistortionsinperceptualstabilityduringdifferentselfmovementsusingvirtualreality AT bellgraham investigatingdistortionsinperceptualstabilityduringdifferentselfmovementsusingvirtualreality AT liyu investigatingdistortionsinperceptualstabilityduringdifferentselfmovementsusingvirtualreality |