
Development of an electrooculogram-based human-computer interface using involuntary eye movement by spatially rotating sound for communication of locked-in patients


Bibliographic Details
Main Authors: Kim, Do Yeon, Han, Chang-Hee, Im, Chang-Hwan
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6014992/
https://www.ncbi.nlm.nih.gov/pubmed/29934518
http://dx.doi.org/10.1038/s41598-018-27865-5
collection PubMed
description Individuals who have lost normal pathways for communication need augmentative and alternative communication (AAC) devices. In this study, we propose a new electrooculogram (EOG)-based human-computer interface (HCI) paradigm for AAC that does not require a user’s voluntary eye movement for binary yes/no communication by patients in a locked-in state (LIS). The proposed HCI uses a horizontal EOG elicited by an involuntary auditory oculogyric reflex in response to a rotating sound source. In the proposed HCI paradigm, a user was asked to selectively attend to one of two sound sources rotating in opposite directions, based on the user’s intention. The user’s intentions could then be recognised by quantifying the EOGs. To validate its performance, a series of experiments was conducted with ten healthy subjects and two patients with amyotrophic lateral sclerosis (ALS). The online experimental results exhibited high classification accuracies of 94% in both healthy subjects and ALS patients when decisions were made every six seconds. The ALS patients also participated in a practical yes/no communication experiment with 26 or 30 questions with known answers. The accuracy of these questionnaire experiments was 94%, demonstrating that our paradigm could constitute an auxiliary AAC system for some LIS patients.
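The abstract does not specify how the horizontal EOG was quantified to decode the attended rotation direction. As an illustration only, the Python sketch below shows one plausible template-correlation scheme for a six-second decision window; the sampling rate, rotation frequency, sinusoidal gaze model, and all variable names are assumptions made for this example, not parameters reported in the paper.

# Minimal sketch (assumed parameters, not the authors' method): decode which of two
# counter-rotating sound sources a user attends from horizontal EOG, assuming the
# attended source entrains a sinusoidal horizontal eye movement whose sign follows
# the rotation direction.
import numpy as np

FS = 250          # sampling rate in Hz (assumed)
F_ROT = 0.5       # rotation frequency of the sound sources in Hz (assumed)
WINDOW_S = 6      # decision window length in seconds (from the abstract)

t = np.arange(0, WINDOW_S, 1.0 / FS)

# Reference horizontal-gaze templates: the two sources rotate in opposite
# directions, so the eye-movement templates they would entrain differ only in sign.
template_yes = np.sin(2 * np.pi * F_ROT * t)   # e.g. clockwise ("yes") source
template_no = -template_yes                    # counter-clockwise ("no") source

def classify_window(eog):
    """Return 'yes' or 'no' for one 6 s window of horizontal EOG samples."""
    x = eog - np.mean(eog)                     # remove DC offset/drift
    score_yes = np.dot(x, template_yes)        # correlation with each template
    score_no = np.dot(x, template_no)
    return "yes" if score_yes > score_no else "no"

# Synthetic usage example: a user attending the clockwise ("yes") source.
rng = np.random.default_rng(0)
simulated_eog = 0.8 * template_yes + 0.5 * rng.standard_normal(t.size)
print(classify_window(simulated_eog))          # expected output: yes

The sketch only illustrates how opposite rotation directions map onto sign-inverted horizontal EOG templates; the decoding actually used in the study may differ.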
format Online
Article
Text
id pubmed-6014992
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-6014992 Sci Rep Article Nature Publishing Group UK 2018-06-22 /pmc/articles/PMC6014992/ /pubmed/29934518 http://dx.doi.org/10.1038/s41598-018-27865-5 Text en © The Author(s) 2018. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
topic Article