
Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task


Bibliographic Details
Main Authors: Sengül, Ali, van Elk, Michiel, Rognini, Giulio, Aspell, Jane Elizabeth, Bleuler, Hannes, Blanke, Olaf
Format: Online Article Text
Language: English
Published: Public Library of Science 2012
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3515602/
https://www.ncbi.nlm.nih.gov/pubmed/23227142
http://dx.doi.org/10.1371/journal.pone.0049473
_version_ 1782252218452279296
author Sengül, Ali
van Elk, Michiel
Rognini, Giulio
Aspell, Jane Elizabeth
Bleuler, Hannes
Blanke, Olaf
author_facet Sengül, Ali
van Elk, Michiel
Rognini, Giulio
Aspell, Jane Elizabeth
Bleuler, Hannes
Blanke, Olaf
author_sort Sengül, Ali
collection PubMed
description The effects of real-world tool use on body or space representations are relatively well established in cognitive neuroscience. Several studies have shown, for example, that active tool use results in a facilitated integration of multisensory information in peripersonal space, i.e. the space directly surrounding the body. However, it remains unknown to what extent similar mechanisms apply to the use of virtual-robotic tools, such as those used in the field of surgical robotics, in which a surgeon may use bimanual haptic interfaces to control a surgery robot at a remote location. This paper presents two experiments in which participants used a haptic handle, originally designed for a commercial surgery robot, to control a virtual tool. The integration of multisensory information related to the virtual-robotic tool was assessed by means of the crossmodal congruency task, in which subjects responded to tactile vibrations applied to their fingers while ignoring visual distractors superimposed on the tip of the virtual-robotic tool. Our results show that active virtual-robotic tool use changes the spatial modulation of the crossmodal congruency effects, comparable to changes in the representation of peripersonal space observed during real-world tool use. Moreover, when the virtual-robotic tools were held in a crossed position, the visual distractors interfered strongly with tactile stimuli that were connected with the hand via the tool, reflecting a remapping of peripersonal space. Such remapping was observed not only when the virtual-robotic tools were actively used (Experiment 1), but also when the tools were passively held (Experiment 2). The present study extends earlier findings on the extension of peripersonal space from physical and pointing tools to virtual-robotic tools using techniques from haptics and virtual reality. We discuss our data with respect to learning and human factors in the field of surgical robotics and discuss the use of new technologies in the field of cognitive neuroscience.
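The crossmodal congruency task described above yields a crossmodal congruency effect (CCE): the slowdown in reaction time when the visual distractor is incongruent with the tactile target compared to when it is congruent, commonly taken as an index of multisensory integration in peripersonal space. A minimal sketch of that computation, assuming invented function names and invented reaction-time values (none of these numbers come from the article):

```python
# Illustrative computation of a crossmodal congruency effect (CCE).
# CCE = mean RT on incongruent trials minus mean RT on congruent trials;
# larger values indicate stronger interference from the visual distractors.

def mean(xs):
    return sum(xs) / len(xs)

def crossmodal_congruency_effect(congruent_rts, incongruent_rts):
    """Return the CCE in milliseconds from two lists of reaction times."""
    return mean(incongruent_rts) - mean(congruent_rts)

# Hypothetical reaction times (ms) for one participant
congruent = [420, 435, 410, 445]
incongruent = [480, 470, 495, 465]

print(crossmodal_congruency_effect(congruent, incongruent))  # → 50.0
```

In studies of this kind, a change in the CCE across tool postures (e.g. crossed vs. uncrossed) is what signals a remapping of peripersonal space, rather than the absolute CCE value itself.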
format Online
Article
Text
id pubmed-3515602
institution National Center for Biotechnology Information
language English
publishDate 2012
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-3515602 2012-12-07 Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task Sengül, Ali van Elk, Michiel Rognini, Giulio Aspell, Jane Elizabeth Bleuler, Hannes Blanke, Olaf PLoS One Research Article The effects of real-world tool use on body or space representations are relatively well established in cognitive neuroscience. Several studies have shown, for example, that active tool use results in a facilitated integration of multisensory information in peripersonal space, i.e. the space directly surrounding the body. However, it remains unknown to what extent similar mechanisms apply to the use of virtual-robotic tools, such as those used in the field of surgical robotics, in which a surgeon may use bimanual haptic interfaces to control a surgery robot at a remote location. This paper presents two experiments in which participants used a haptic handle, originally designed for a commercial surgery robot, to control a virtual tool. The integration of multisensory information related to the virtual-robotic tool was assessed by means of the crossmodal congruency task, in which subjects responded to tactile vibrations applied to their fingers while ignoring visual distractors superimposed on the tip of the virtual-robotic tool. Our results show that active virtual-robotic tool use changes the spatial modulation of the crossmodal congruency effects, comparable to changes in the representation of peripersonal space observed during real-world tool use. Moreover, when the virtual-robotic tools were held in a crossed position, the visual distractors interfered strongly with tactile stimuli that were connected with the hand via the tool, reflecting a remapping of peripersonal space. Such remapping was observed not only when the virtual-robotic tools were actively used (Experiment 1), but also when the tools were passively held (Experiment 2). The present study extends earlier findings on the extension of peripersonal space from physical and pointing tools to virtual-robotic tools using techniques from haptics and virtual reality. We discuss our data with respect to learning and human factors in the field of surgical robotics and discuss the use of new technologies in the field of cognitive neuroscience. Public Library of Science 2012-12-05 /pmc/articles/PMC3515602/ /pubmed/23227142 http://dx.doi.org/10.1371/journal.pone.0049473 Text en © 2012 Sengül et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
spellingShingle Research Article
Sengül, Ali
van Elk, Michiel
Rognini, Giulio
Aspell, Jane Elizabeth
Bleuler, Hannes
Blanke, Olaf
Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task
title Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task
title_full Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task
title_fullStr Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task
title_full_unstemmed Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task
title_short Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task
title_sort extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3515602/
https://www.ncbi.nlm.nih.gov/pubmed/23227142
http://dx.doi.org/10.1371/journal.pone.0049473
work_keys_str_mv AT sengulali extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT vanelkmichiel extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT rogninigiulio extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT aspelljaneelizabeth extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT bleulerhannes extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT blankeolaf extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask