
Measuring perceived self-location in virtual reality

Bibliographic Details
Main Authors: Nakul, Estelle, Orlando-Dessaints, Nicolas, Lenggenhager, Bigna, Lopez, Christophe
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7176655/
https://www.ncbi.nlm.nih.gov/pubmed/32321976
http://dx.doi.org/10.1038/s41598-020-63643-y
_version_ 1783525050394083328
author Nakul, Estelle
Orlando-Dessaints, Nicolas
Lenggenhager, Bigna
Lopez, Christophe
author_facet Nakul, Estelle
Orlando-Dessaints, Nicolas
Lenggenhager, Bigna
Lopez, Christophe
author_sort Nakul, Estelle
collection PubMed
description Third-person perspective full-body illusions (3PP-FBI) enable the manipulation of perceived self-location through multisensory stimulation. Perceived self-location is classically measured with a locomotion task. Yet, because locomotion modulates various sensory signals, we developed in immersive virtual reality a measure of self-location that does not require locomotion. Tactile stimulation was applied to the back of twenty-five participants and displayed synchronously or asynchronously on the back of an avatar seen from behind. Participants completed the locomotion task and a novel mental imagery task in which they located themselves relative to a virtual ball approaching them. Participants self-identified with the avatar more during synchronous than asynchronous visuo-tactile stimulation in both tasks. This effect was accentuated in the mental imagery task, which showed a larger self-relocation toward the avatar, and only in this task were reports of presence, bi-location and disembodiment higher in the synchronous condition. In conclusion, the results suggest that avoiding multisensory updating during walking, and using a perceptual rather than a motor task, can improve measures of illusory self-location.
format Online
Article
Text
id pubmed-7176655
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-7176655 2020-04-27 Measuring perceived self-location in virtual reality Nakul, Estelle Orlando-Dessaints, Nicolas Lenggenhager, Bigna Lopez, Christophe Sci Rep Article Nature Publishing Group UK 2020-04-22 /pmc/articles/PMC7176655/ /pubmed/32321976 http://dx.doi.org/10.1038/s41598-020-63643-y Text en © The Author(s) 2020. Open Access under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Nakul, Estelle
Orlando-Dessaints, Nicolas
Lenggenhager, Bigna
Lopez, Christophe
Measuring perceived self-location in virtual reality
title Measuring perceived self-location in virtual reality
title_full Measuring perceived self-location in virtual reality
title_fullStr Measuring perceived self-location in virtual reality
title_full_unstemmed Measuring perceived self-location in virtual reality
title_short Measuring perceived self-location in virtual reality
title_sort measuring perceived self-location in virtual reality
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7176655/
https://www.ncbi.nlm.nih.gov/pubmed/32321976
http://dx.doi.org/10.1038/s41598-020-63643-y
work_keys_str_mv AT nakulestelle measuringperceivedselflocationinvirtualreality
AT orlandodessaintsnicolas measuringperceivedselflocationinvirtualreality
AT lenggenhagerbigna measuringperceivedselflocationinvirtualreality
AT lopezchristophe measuringperceivedselflocationinvirtualreality