
MPI CyberMotion Simulator: Implementation of a Novel Motion Simulator to Investigate Multisensory Path Integration in Three Dimensions

Path integration is a process in which self-motion is integrated over time to obtain an estimate of one's current position relative to a starting point (1). Humans can perform path integration based exclusively on visual (2-3), auditory (4), or inertial cues (5). However, when multiple cues are present, inertial cues - particularly kinaesthetic ones - seem to dominate (6-7). In the absence of vision, humans tend to overestimate short distances (<5 m) and small turning angles (<30°), but underestimate longer ones (5). Movement through physical space therefore does not seem to be accurately represented by the brain. Extensive work has been done on evaluating path integration in the horizontal plane, but little is known about vertical movement (see (3) for virtual movement from vision alone). One reason for this is that traditional motion simulators have a small range of motion restricted mainly to the horizontal plane. Here we take advantage of a motion simulator (8-9) with a large range of motion to assess whether path integration is similar between the horizontal and vertical planes. The relative contributions of inertial and visual cues to path navigation were also assessed. Sixteen observers sat upright in a seat mounted to the flange of a modified KUKA anthropomorphic robot arm. Sensory information was manipulated by providing visual (optic flow, limited-lifetime star field), vestibular-kinaesthetic (passive self-motion with eyes closed), or combined visual and vestibular-kinaesthetic motion cues. Movement trajectories in the horizontal, sagittal and frontal planes consisted of two segments (1st: 0.4 m, 2nd: 1 m; ±0.24 m/s² peak acceleration). The angle between the two segments was either 45° or 90°. Observers pointed back to their origin by moving an arrow that was superimposed on an avatar presented on the screen. Observers were more likely to underestimate the angle travelled for movement in the horizontal plane than in the vertical planes. In the frontal plane observers were more likely to overestimate the angle, while there was no such bias in the sagittal plane. Finally, observers responded more slowly when answering based on vestibular-kinaesthetic information alone. Human path integration based on vestibular-kinaesthetic information alone thus takes longer than when visual information is present. That pointing is consistent with underestimation of the angle travelled in the horizontal plane and overestimation in the vertical planes suggests that the neural representation of self-motion through space is non-symmetrical, which may relate to the fact that humans experience movement mostly within the horizontal plane.
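
To make the pointing geometry concrete, the following illustrative Python sketch (not part of the published protocol; segment lengths and turn angles are taken from the abstract, and a planar, leftward turn is assumed) computes the correct homing direction and straight-line distance back to the start for the two-segment trajectories used in the experiment.

import numpy as np

def homing_direction(seg1=0.4, seg2=1.0, turn_deg=90.0):
    """Correct pointing response for a two-segment path (illustrative sketch).

    The observer moves seg1 metres along the initial heading, turns left by
    turn_deg degrees, moves seg2 metres, and must then point back to the
    starting position. Returns the signed angle (degrees, positive = left)
    from the final heading to the homing vector, and the distance to the start.
    """
    turn = np.radians(turn_deg)
    # End position: first segment along +x, second segment along the turned heading.
    end = np.array([seg1 + seg2 * np.cos(turn), seg2 * np.sin(turn)])
    heading = np.array([np.cos(turn), np.sin(turn)])  # final facing direction
    back = -end                                       # vector from end point to origin
    # Signed angle between final heading and homing vector, wrapped to [-180, 180).
    angle = np.degrees(np.arctan2(back[1], back[0]) - np.arctan2(heading[1], heading[0]))
    angle = (angle + 180.0) % 360.0 - 180.0
    return angle, float(np.linalg.norm(end))

for turn in (45.0, 90.0):
    ang, dist = homing_direction(turn_deg=turn)
    print(f"{turn:.0f} deg turn: point {ang:.1f} deg from final heading, {dist:.2f} m to start")

Under these assumptions the 90° condition, for example, requires pointing about 158° to the left across a straight-line distance of roughly 1.08 m back to the start.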


Bibliographic Details
Main Authors: Barnett-Cowan, Michael; Meilinger, Tobias; Vidal, Manuel; Teufel, Harald; Bülthoff, Heinrich H.
Format: Online Article (Text)
Language: English
Journal: J Vis Exp
Published: MyJove Corporation, 2012
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3468186/
https://www.ncbi.nlm.nih.gov/pubmed/22617497
http://dx.doi.org/10.3791/3436