Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms
The paper reviews nine robotic and virtual reality (VR) brain–computer interface (BCI) projects developed by the author, in collaboration with his graduate students, within the BCI–lab research group during its association with the University of Tsukuba, Japan. The nine novel approaches are discussed in...
Main Author: | Rutkowski, Tomasz M. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A. 2016 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5138204/ https://www.ncbi.nlm.nih.gov/pubmed/27999538 http://dx.doi.org/10.3389/fnbot.2016.00020 |
_version_ | 1782472020575911936 |
---|---|
author | Rutkowski, Tomasz M. |
author_facet | Rutkowski, Tomasz M. |
author_sort | Rutkowski, Tomasz M. |
collection | PubMed |
description | The paper reviews nine robotic and virtual reality (VR) brain–computer interface (BCI) projects developed by the author, in collaboration with his graduate students, within the BCI–lab research group during its association with the University of Tsukuba, Japan. The nine novel approaches are discussed in applications to direct brain–robot and brain–virtual-reality-agent control interfaces using tactile and auditory BCI technologies. The BCI user's intentions are decoded from brainwaves in real time using non-invasive electroencephalography (EEG) and translated into thought-based-only control of a symbiotic robot or virtual reality agent. A communication protocol between the BCI output and the robot or virtual environment is realized in a symbiotic communication scenario using the user datagram protocol (UDP), which constitutes an internet of things (IoT) control scenario. Results obtained from healthy users reproducing simple brain–robot and brain–virtual-agent control tasks in online experiments support the research goal of interacting with robotic devices and virtual reality agents using symbiotic thought-based BCI technologies. An offline BCI classification accuracy boosting method, based on a previously proposed information-geometry-derived approach, is also discussed to further support the reviewed robotic and virtual reality thought-based control paradigms. |
format | Online Article Text |
id | pubmed-5138204 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2016 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-51382042016-12-20 Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms Rutkowski, Tomasz M. Front Neurorobot Neuroscience The paper reviews nine robotic and virtual reality (VR) brain–computer interface (BCI) projects developed by the author, in collaboration with his graduate students, within the BCI–lab research group during its association with the University of Tsukuba, Japan. The nine novel approaches are discussed in applications to direct brain–robot and brain–virtual-reality-agent control interfaces using tactile and auditory BCI technologies. The BCI user's intentions are decoded from brainwaves in real time using non-invasive electroencephalography (EEG) and translated into thought-based-only control of a symbiotic robot or virtual reality agent. A communication protocol between the BCI output and the robot or virtual environment is realized in a symbiotic communication scenario using the user datagram protocol (UDP), which constitutes an internet of things (IoT) control scenario. Results obtained from healthy users reproducing simple brain–robot and brain–virtual-agent control tasks in online experiments support the research goal of interacting with robotic devices and virtual reality agents using symbiotic thought-based BCI technologies. An offline BCI classification accuracy boosting method, based on a previously proposed information-geometry-derived approach, is also discussed to further support the reviewed robotic and virtual reality thought-based control paradigms. Frontiers Media S.A. 2016-12-06 /pmc/articles/PMC5138204/ /pubmed/27999538 http://dx.doi.org/10.3389/fnbot.2016.00020 Text en Copyright © 2016 Rutkowski. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). 
The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Rutkowski, Tomasz M. Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms |
title | Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms |
title_full | Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms |
title_fullStr | Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms |
title_full_unstemmed | Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms |
title_short | Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms |
title_sort | robotic and virtual reality bcis using spatial tactile and auditory oddball paradigms |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5138204/ https://www.ncbi.nlm.nih.gov/pubmed/27999538 http://dx.doi.org/10.3389/fnbot.2016.00020 |
work_keys_str_mv | AT rutkowskitomaszm roboticandvirtualrealitybcisusingspatialtactileandauditoryoddballparadigms |
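The abstract describes a UDP-based symbiotic communication channel between the BCI output and the robot or virtual-reality agent. The paper does not specify the message format or endpoint, so the address, port, and JSON payload schema below are illustrative assumptions only; a minimal sketch of sending one decoded BCI intention as a single UDP datagram could look like:

```python
import json
import socket

# Hypothetical robot / VR-agent endpoint (not taken from the paper).
ROBOT_ADDR = ("127.0.0.1", 9999)

def send_bci_command(command: str, confidence: float, sock: socket.socket = None) -> bytes:
    """Serialize a decoded BCI intention and send it as one UDP datagram.

    Returns the raw payload bytes so callers can log or inspect them.
    """
    own_socket = sock is None
    if own_socket:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Assumed JSON schema: the paper does not document its wire format.
    payload = json.dumps({"command": command, "confidence": confidence}).encode("utf-8")
    try:
        # UDP is connectionless: sendto succeeds even if no receiver is listening,
        # which matches the fire-and-forget IoT control scenario in the abstract.
        sock.sendto(payload, ROBOT_ADDR)
    finally:
        if own_socket:
            sock.close()
    return payload
```

UDP is a natural fit here because a real-time BCI emits a stream of low-latency, small commands where an occasional dropped datagram is preferable to the head-of-line blocking a TCP connection would introduce.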
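The abstract also mentions an offline classification-accuracy boosting method based on an information-geometry-derived approach, without detailing it. One common information-geometry technique for EEG classifies trial covariance matrices by their distance to per-class mean covariances on the manifold of symmetric positive-definite (SPD) matrices. The sketch below uses the log-Euclidean metric as an assumption; the paper's exact method may differ.

```python
import numpy as np

def spd_log(A: np.ndarray) -> np.ndarray:
    """Matrix logarithm of a symmetric positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.log(w)) @ V.T

def fit_class_means(covs, labels):
    """Per-class mean covariance computed in the log (tangent) space.

    Averaging matrix logarithms gives the log-Euclidean mean, a simple
    information-geometry prototype for each class.
    """
    means = {}
    for c in set(labels):
        logs = [spd_log(C) for C, l in zip(covs, labels) if l == c]
        means[c] = np.mean(logs, axis=0)
    return means

def classify(C: np.ndarray, means) -> object:
    """Assign a trial covariance C to the class with the nearest log-space mean."""
    logC = spd_log(C)
    return min(means, key=lambda c: np.linalg.norm(logC - means[c], ord="fro"))
```

For example, with two-channel trial covariances where "target" trials have more variance on the first channel, `classify(np.diag([1.9, 1.1]), means)` lands on the target class. Working in the log space respects the curved geometry of SPD matrices, which is what typically boosts accuracy over naive Euclidean averaging of covariances.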