
The Oculus Rift: a cost-effective tool for studying visual-vestibular interactions in self-motion perception

For years now, virtual reality devices have been applied in the field of vision science in an attempt to improve our understanding of perceptual principles underlying the experience of self-motion. Some of this research has been concerned with exploring factors involved in the visually-induced illusory perception of self-motion, known as vection. We examined the usefulness of the cost-effective Oculus Rift in generating vection in seated observers. This device has the capacity to display optic flow in world coordinates by compensating for tracked changes in 3D head orientation. We measured vection strength in three conditions of visual compensation for head movement: compensated, uncompensated, and inversely compensated. During presentation of optic flow, the observer was instructed to make periodic head oscillations (±22° horizontal excursions at approximately 0.53 Hz). We found that vection was best in the compensated condition, and was weakest in the inversely compensated condition. Surprisingly, vection was always better in passive viewing conditions, compared with conditions where active head rotations were performed. These findings suggest that vection is highly dependent on interactions between visual, vestibular and proprioceptive information, and may be highly sensitive to limitations of temporal lag in visual-vestibular coupling using this system.

Bibliographic Details
Main Authors: Kim, Juno; Chung, Charles Y. L.; Nakamura, Shinji; Palmisano, Stephen; Khuu, Sieu K.
Format: Online, Article, Text
Language: English
Published: Frontiers Media S.A. 2015
Subjects: Psychology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4358060/
https://www.ncbi.nlm.nih.gov/pubmed/25821438
http://dx.doi.org/10.3389/fpsyg.2015.00248
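
The abstract above describes three visual-compensation conditions (compensated, uncompensated, and inversely compensated) applied to tracked head orientation, together with an instructed head oscillation of ±22° at roughly 0.53 Hz. As a rough illustration only, the Python sketch below models those conditions as a simple gain applied to the tracked head yaw; the gain formulation, function names, and all parameter values other than the ±22° amplitude and 0.53 Hz frequency are assumptions made here for illustration, not code or settings from the study.

import numpy as np

# Hypothetical gain for each visual-compensation condition named in the
# abstract. Modelling the conditions as a single yaw gain is an assumption
# made for illustration; it is not code or a specification from the paper.
CONDITION_GAINS = {
    "compensated": 1.0,             # display counter-rotates with tracked head yaw
    "uncompensated": 0.0,           # head tracking ignored; flow stays head-fixed
    "inversely compensated": -1.0,  # display rotates opposite to the usual compensation
}


def head_yaw_trace(duration_s=30.0, sample_rate_hz=75.0,
                   amplitude_deg=22.0, frequency_hz=0.53):
    """Simulate the instructed head oscillation: +/-22 deg yaw at ~0.53 Hz."""
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    return t, amplitude_deg * np.sin(2.0 * np.pi * frequency_hz * t)


def scene_yaw(head_yaw_deg, gain):
    """Yaw of the rendered flow field, in head coordinates and in world coordinates."""
    in_head = -gain * head_yaw_deg     # counter-rotation applied to the display
    in_world = in_head + head_yaw_deg  # what the flow does relative to the room
    return in_head, in_world


if __name__ == "__main__":
    t, head_yaw = head_yaw_trace()
    for condition, gain in CONDITION_GAINS.items():
        in_head, in_world = scene_yaw(head_yaw, gain)
        print(f"{condition:>22s}: peak world-relative flow yaw "
              f"{np.max(np.abs(in_world)):5.1f} deg, "
              f"peak head-relative flow yaw {np.max(np.abs(in_head)):5.1f} deg")

Under this assumed formulation, a gain of 1 leaves the simulated flow field stationary in world coordinates, a gain of 0 leaves it fixed to the head, and a gain of -1 makes it rotate through twice the head's excursion in the world, which is one plausible reading of the inversely compensated condition.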
_version_ 1782361239750443008
author Kim, Juno
Chung, Charles Y. L.
Nakamura, Shinji
Palmisano, Stephen
Khuu, Sieu K.
author_sort Kim, Juno
collection PubMed
description For years now, virtual reality devices have been applied in the field of vision science in an attempt to improve our understanding of perceptual principles underlying the experience of self-motion. Some of this research has been concerned with exploring factors involved in the visually-induced illusory perception of self-motion, known as vection. We examined the usefulness of the cost-effective Oculus Rift in generating vection in seated observers. This device has the capacity to display optic flow in world coordinates by compensating for tracked changes in 3D head orientation. We measured vection strength in three conditions of visual compensation for head movement: compensated, uncompensated, and inversely compensated. During presentation of optic flow, the observer was instructed to make periodic head oscillations (±22° horizontal excursions at approximately 0.53 Hz). We found that vection was best in the compensated condition, and was weakest in the inversely compensated condition. Surprisingly, vection was always better in passive viewing conditions, compared with conditions where active head rotations were performed. These findings suggest that vection is highly dependent on interactions between visual, vestibular and proprioceptive information, and may be highly sensitive to limitations of temporal lag in visual-vestibular coupling using this system.
format Online
Article
Text
id pubmed-4358060
institution National Center for Biotechnology Information
language English
publishDate 2015
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-4358060 2015-03-27
The Oculus Rift: a cost-effective tool for studying visual-vestibular interactions in self-motion perception
Kim, Juno; Chung, Charles Y. L.; Nakamura, Shinji; Palmisano, Stephen; Khuu, Sieu K.
Front Psychol (Psychology)
For years now, virtual reality devices have been applied in the field of vision science in an attempt to improve our understanding of perceptual principles underlying the experience of self-motion. Some of this research has been concerned with exploring factors involved in the visually-induced illusory perception of self-motion, known as vection. We examined the usefulness of the cost-effective Oculus Rift in generating vection in seated observers. This device has the capacity to display optic flow in world coordinates by compensating for tracked changes in 3D head orientation. We measured vection strength in three conditions of visual compensation for head movement: compensated, uncompensated, and inversely compensated. During presentation of optic flow, the observer was instructed to make periodic head oscillations (±22° horizontal excursions at approximately 0.53 Hz). We found that vection was best in the compensated condition, and was weakest in the inversely compensated condition. Surprisingly, vection was always better in passive viewing conditions, compared with conditions where active head rotations were performed. These findings suggest that vection is highly dependent on interactions between visual, vestibular and proprioceptive information, and may be highly sensitive to limitations of temporal lag in visual-vestibular coupling using this system.
Frontiers Media S.A. 2015-03-13
/pmc/articles/PMC4358060/ /pubmed/25821438 http://dx.doi.org/10.3389/fpsyg.2015.00248
Text en
Copyright © 2015 Kim, Chung, Nakamura, Palmisano and Khuu. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title The Oculus Rift: a cost-effective tool for studying visual-vestibular interactions in self-motion perception
title_sort oculus rift: a cost-effective tool for studying visual-vestibular interactions in self-motion perception
topic Psychology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4358060/
https://www.ncbi.nlm.nih.gov/pubmed/25821438
http://dx.doi.org/10.3389/fpsyg.2015.00248