Modulation of Visually Evoked Postural Responses by Contextual Visual, Haptic and Auditory Information: A ‘Virtual Reality Check’

Externally generated visual motion signals can cause the illusion of self-motion in space (vection) and corresponding visually evoked postural responses (VEPRs). These VEPRs are not simple responses to optokinetic stimulation, but are modulated by the configuration of the environment. The aim of this paper is to explore what factors modulate VEPRs in a high-quality virtual reality (VR) environment where real and virtual foreground objects served as static visual, auditory and haptic reference points. Data from four experiments on visually evoked postural responses show that: 1) visually evoked postural sway in the lateral direction is modulated by the presence of static anchor points, which can be haptic, visual or auditory reference signals; 2) real objects and their matching virtual reality representations have different effects on postural sway when used as visual anchors; 3) visual motion in the anterior-posterior plane induces robust postural responses that are not modulated by the presence of reference signals or by the reality of objects that can serve as visual anchors in the scene. We conclude that automatic postural responses to laterally moving visual stimuli are strongly influenced by the configuration and interpretation of the environment and draw on multisensory representations. Different postural responses were observed for real and virtual visual reference objects. On the basis that automatic visually evoked postural responses in high-fidelity virtual environments should mimic those seen in real situations, we propose to use the observed effect as a robust objective test for presence and fidelity in VR.

Bibliographic Details
Main Authors: Meyer, Georg F., Shao, Fei, White, Mark D., Hopkins, Carl, Robotham, Antony J.
Format: Online Article Text
Language: English
Published: Public Library of Science, 2013-06-28
Journal: PLoS One
Collection: PubMed (id: pubmed-3695920)
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3695920/
https://www.ncbi.nlm.nih.gov/pubmed/23840760
http://dx.doi.org/10.1371/journal.pone.0067651
License: © 2013 Meyer et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.