
Multimodal signal dataset for 11 intuitive movement tasks from single upper extremity during multiple recording sessions


Bibliographic Details
Main Authors: Jeong, Ji-Hoon, Cho, Jeong-Hyun, Shim, Kyung-Hwan, Kwon, Byoung-Hee, Lee, Byeong-Hoo, Lee, Do-Yeun, Lee, Dae-Hyeok, Lee, Seong-Whan
Format: Online Article Text
Language: English
Published: Oxford University Press 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7539536/
https://www.ncbi.nlm.nih.gov/pubmed/33034634
http://dx.doi.org/10.1093/gigascience/giaa098
author Jeong, Ji-Hoon
Cho, Jeong-Hyun
Shim, Kyung-Hwan
Kwon, Byoung-Hee
Lee, Byeong-Hoo
Lee, Do-Yeun
Lee, Dae-Hyeok
Lee, Seong-Whan
collection PubMed
description BACKGROUND: Non-invasive brain–computer interfaces (BCIs) have been developed to realize natural bi-directional interaction between users and external robotic systems. However, communication between users and BCI systems through artificial matching remains a critical issue. Recently, BCIs have been developed to adopt intuitive decoding, which is key to solving several problems, such as the small number of classes and the manual matching of BCI commands with device control. Unfortunately, advances in this area have been slow owing to the lack of large, uniform datasets. This study provides a large intuitive dataset for 11 different upper extremity movement tasks obtained during multiple recording sessions. The dataset includes 60-channel electroencephalography, 7-channel electromyography, and 4-channel electro-oculography from 25 healthy participants, collected over three recording sessions on separate days for a total of 82,500 trials across all participants.
FINDINGS: We validated the dataset via neurophysiological analysis. We observed clear sensorimotor de-/activation and the spatial distributions associated with real movement and motor imagery, respectively. Furthermore, we demonstrated the consistency of the dataset by evaluating the classification performance of each session using a baseline machine learning method.
CONCLUSIONS: The dataset covers multiple recording sessions, various classes within the single upper extremity, and multimodal signals. It can be used to (i) compare the brain activities associated with real movement and imagination, (ii) improve decoding performance, and (iii) analyze differences among recording sessions. Hence, this study, as a Data Note, has focused on collecting the data required for further advances in BCI technology.
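As a hedged illustration only (not part of this record, and not the authors' published pipeline): the "baseline machine learning method" mentioned above can be sketched with a standard motor-imagery baseline, log-variance band-power features followed by linear discriminant analysis (LDA). The Python snippet below runs that baseline on synthetic epochs shaped like one session of this dataset; only the 60 EEG channels and 11 classes come from the record, while the trial count, epoch length, and random data are placeholder assumptions.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Assumed demo dimensions: 300 trials, 60 EEG channels, 1,000 samples
# per epoch, 11 movement classes (only the channel and class counts
# reflect the record; the rest are placeholders).
n_trials, n_channels, n_times, n_classes = 300, 60, 1000, 11

X = rng.standard_normal((n_trials, n_channels, n_times))  # stand-in epochs
y = rng.integers(0, n_classes, size=n_trials)             # stand-in labels

# Log-variance per channel is a classic band-power-style feature for
# sensorimotor rhythms (assumes epochs are already band-pass filtered).
features = np.log(X.var(axis=2))

clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, features, y, cv=5)
print(f"5-fold accuracy: {scores.mean():.3f} (chance ~ {1 / n_classes:.3f})")

On real epochs, spatial filtering such as common spatial patterns before LDA would be the usual next refinement; on the random data above the score should sit near chance, which serves as a sanity check.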
format Online
Article
Text
id pubmed-7539536
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Oxford University Press
record_format MEDLINE/PubMed
spelling pubmed-7539536 2020-10-13. GigaScience, Data Note. Oxford University Press, 2020-10-07. © The Author(s) 2020. Published by Oxford University Press GigaScience. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.
title Multimodal signal dataset for 11 intuitive movement tasks from single upper extremity during multiple recording sessions
topic Data Note