Brain-computer interface for robot control with eye artifacts for assistive applications
Human-robot interaction is a rapidly developing field and robots have been taking more active roles in our daily lives. Patient care is one of the fields in which robots are becoming more present, especially for people with disabilities. People with neurodegenerative disorders might not consciously...
Main Authors: | Karas, Kaan, Pozzi, Luca, Pedrocchi, Alessandra, Braghin, Francesco, Roveda, Loris |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10579221/ https://www.ncbi.nlm.nih.gov/pubmed/37845318 http://dx.doi.org/10.1038/s41598-023-44645-y |
_version_ | 1785121678623244288 |
---|---|
author | Karas, Kaan Pozzi, Luca Pedrocchi, Alessandra Braghin, Francesco Roveda, Loris |
author_facet | Karas, Kaan Pozzi, Luca Pedrocchi, Alessandra Braghin, Francesco Roveda, Loris |
author_sort | Karas, Kaan |
collection | PubMed |
description | Human-robot interaction is a rapidly developing field, and robots are taking increasingly active roles in our daily lives. Patient care is one of the fields in which robots are becoming more present, especially for people with disabilities. People with neurodegenerative disorders may be unable to consciously or voluntarily produce movements other than those involving the eyes or eyelids. In this context, Brain-Computer Interface (BCI) systems present an alternative way to communicate or interact with the external world. To improve the lives of people with disabilities, this paper presents a novel BCI that controls an assistive robot with the user's eye artifacts. In this study, the eye artifacts that contaminate electroencephalogram (EEG) signals are treated as a valuable source of information thanks to their high signal-to-noise ratio and intentional generation. The proposed methodology detects eye artifacts in EEG signals through the characteristic shapes that occur during such events. Lateral eye movements are distinguished by their ordered peak-and-valley formation and by the opposite phase of the signals measured at the F7 and F8 channels. To the best of the authors' knowledge, this is the first method to use this behavior to detect lateral eye movements. For blink detection, the authors propose a double-thresholding method that catches weak blinks as well as regular ones, differentiating it from other algorithms in the literature, which normally use only one threshold. Events detected in real time, together with their virtual time stamps, are fed into a second algorithm that distinguishes double and quadruple blinks from single blinks based on occurrence frequency. After the algorithm was tested offline and in real time, it was implemented on the device. The resulting BCI was used to control an assistive robot through a graphical user interface.
Validation experiments with 5 participants demonstrate that the developed BCI is able to control the robot. |
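The two detection ideas in the abstract can be sketched in code. This is a minimal illustrative sketch, not the authors' exact algorithm: the thresholds, channel conventions, minimum run width, and the sign convention for direction are all assumptions made for the example.

```python
import numpy as np

def detect_blinks(fp1, high_thr=80.0, low_thr=40.0):
    """Double-threshold blink detection on a frontal EEG trace (e.g. Fp1,
    in microvolts; channel and thresholds are illustrative assumptions).

    A contiguous run of samples above low_thr is a candidate event:
    - if its peak also exceeds high_thr, it is accepted as a regular blink;
    - otherwise it is accepted as a weak blink only if the run is wide
      enough to look like a real deflection rather than noise.
    Returns the sample indices of the detected blink peaks.
    """
    events, i, n = [], 0, len(fp1)
    while i < n:
        if fp1[i] > low_thr:
            j = i
            while j < n and fp1[j] > low_thr:
                j += 1
            seg = fp1[i:j]
            if seg.max() > high_thr or len(seg) >= 3:  # regular or weak blink
                events.append(i + int(np.argmax(seg)))
            i = j
        else:
            i += 1
    return events

def lateral_direction(f7, f8, anti_corr_thr=-0.5):
    """Classify a short window as a lateral eye movement using the opposite
    phase of F7/F8: strongly anti-correlated deflections indicate a lateral
    movement, and the sign of the dominant F7 deflection gives the direction
    (the left/right sign convention here is an assumption)."""
    if np.corrcoef(f7, f8)[0, 1] > anti_corr_thr:
        return None  # channels roughly in phase: not a lateral movement
    return "left" if f7[int(np.argmax(np.abs(f7)))] > 0 else "right"
```

With this sketch, a trace containing one strong and one weaker Gaussian-shaped deflection yields two blink events (the second caught only because of the lower threshold), while mirrored F7/F8 deflections are classified as a lateral movement and in-phase ones are rejected.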
format | Online Article Text |
id | pubmed-10579221 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-10579221 2023-10-18 Brain-computer interface for robot control with eye artifacts for assistive applications Karas, Kaan Pozzi, Luca Pedrocchi, Alessandra Braghin, Francesco Roveda, Loris Sci Rep Article Human-robot interaction is a rapidly developing field, and robots are taking increasingly active roles in our daily lives. Patient care is one of the fields in which robots are becoming more present, especially for people with disabilities. People with neurodegenerative disorders may be unable to consciously or voluntarily produce movements other than those involving the eyes or eyelids. In this context, Brain-Computer Interface (BCI) systems present an alternative way to communicate or interact with the external world. To improve the lives of people with disabilities, this paper presents a novel BCI that controls an assistive robot with the user's eye artifacts. In this study, the eye artifacts that contaminate electroencephalogram (EEG) signals are treated as a valuable source of information thanks to their high signal-to-noise ratio and intentional generation. The proposed methodology detects eye artifacts in EEG signals through the characteristic shapes that occur during such events. Lateral eye movements are distinguished by their ordered peak-and-valley formation and by the opposite phase of the signals measured at the F7 and F8 channels. To the best of the authors' knowledge, this is the first method to use this behavior to detect lateral eye movements. For blink detection, the authors propose a double-thresholding method that catches weak blinks as well as regular ones, differentiating it from other algorithms in the literature, which normally use only one threshold. Events detected in real time, together with their virtual time stamps, are fed into a second algorithm that distinguishes double and quadruple blinks from single blinks based on occurrence frequency.
After the algorithm was tested offline and in real time, it was implemented on the device. The resulting BCI was used to control an assistive robot through a graphical user interface. Validation experiments with 5 participants demonstrate that the developed BCI is able to control the robot. Nature Publishing Group UK 2023-10-16 /pmc/articles/PMC10579221/ /pubmed/37845318 http://dx.doi.org/10.1038/s41598-023-44645-y Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Article Karas, Kaan Pozzi, Luca Pedrocchi, Alessandra Braghin, Francesco Roveda, Loris Brain-computer interface for robot control with eye artifacts for assistive applications |
title | Brain-computer interface for robot control with eye artifacts for assistive applications |
title_full | Brain-computer interface for robot control with eye artifacts for assistive applications |
title_fullStr | Brain-computer interface for robot control with eye artifacts for assistive applications |
title_full_unstemmed | Brain-computer interface for robot control with eye artifacts for assistive applications |
title_short | Brain-computer interface for robot control with eye artifacts for assistive applications |
title_sort | brain-computer interface for robot control with eye artifacts for assistive applications |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10579221/ https://www.ncbi.nlm.nih.gov/pubmed/37845318 http://dx.doi.org/10.1038/s41598-023-44645-y |
work_keys_str_mv | AT karaskaan braincomputerinterfaceforrobotcontrolwitheyeartifactsforassistiveapplications AT pozziluca braincomputerinterfaceforrobotcontrolwitheyeartifactsforassistiveapplications AT pedrocchialessandra braincomputerinterfaceforrobotcontrolwitheyeartifactsforassistiveapplications AT braghinfrancesco braincomputerinterfaceforrobotcontrolwitheyeartifactsforassistiveapplications AT rovedaloris braincomputerinterfaceforrobotcontrolwitheyeartifactsforassistiveapplications |