
A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions

Bibliographic Details

Main Author: Kashihara, Koji
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2014
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4144423/
https://www.ncbi.nlm.nih.gov/pubmed/25206321
http://dx.doi.org/10.3389/fnins.2014.00244
_version_ 1782332053077884928
author Kashihara, Koji
author_facet Kashihara, Koji
author_sort Kashihara, Koji
collection PubMed
description Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred from facial expressions, emotional prediction for neutral faces requires advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To this end, the study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified the corresponding cortical activities. The event-related potentials triggered by conditioned neutral faces, originating from the posterior temporal lobe, changed significantly during late face processing (600–700 ms after stimulus) rather than in early face processing activities such as the P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). The study also developed an efficient method for detecting implicit negative emotional responses to specific faces from EEG signals: a classifier based on a support vector machine enables the straightforward classification of neutral faces that trigger specific individual emotions. In accordance with this classification, a face on a computer morphs into a sad or displeased countenance. The proposed method could be incorporated into non-verbal communication tools to enable emotional expression.
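The abstract describes classifying emotion-conditioned versus unconditioned neutral-face trials from EEG features with a support vector machine. The sketch below illustrates that kind of pipeline under stated assumptions: the features (mean ERP amplitude per channel in the late 600–700 ms window) and the synthetic data are illustrative choices, not the paper's actual preprocessing or dataset.

```python
# Hypothetical sketch of SVM-based classification of EEG trial features,
# in the spirit of the method described in the abstract. The synthetic
# "trials" below stand in for per-channel mean amplitudes in a late
# (600-700 ms) ERP window; the real paper's feature extraction may differ.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class, n_channels = 100, 8

# Unconditioned neutral-face trials: baseline amplitudes.
X_neutral = rng.normal(0.0, 1.0, size=(n_per_class, n_channels))
# Conditioned trials: a small simulated late-window amplitude shift.
X_conditioned = rng.normal(0.5, 1.0, size=(n_per_class, n_channels))

X = np.vstack([X_neutral, X_conditioned])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Standardize features, then fit an RBF-kernel SVM; evaluate with
# 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With a genuine amplitude difference between conditions, cross-validated accuracy should land well above the 0.5 chance level; in a BCI setting the predicted class label would then drive the output (here, morphing an on-screen face toward a sad or displeased expression).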
format Online
Article
Text
id pubmed-4144423
institution National Center for Biotechnology Information
language English
publishDate 2014
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-41444232014-09-09 A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions Kashihara, Koji Front Neurosci Neuroscience Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600–700 ms) after stimulus, rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). 
This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals. A classification method based on a support vector machine enables the easy classification of neutral faces that trigger specific individual emotions. In accordance with this classification, a face on a computer morphs into a sad or displeased countenance. The proposed method could be incorporated as a part of non-verbal communication tools to enable emotional expression. Frontiers Media S.A. 2014-08-26 /pmc/articles/PMC4144423/ /pubmed/25206321 http://dx.doi.org/10.3389/fnins.2014.00244 Text en Copyright © 2014 Kashihara. http://creativecommons.org/licenses/by/3.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Kashihara, Koji
A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions
title A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions
title_full A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions
title_fullStr A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions
title_full_unstemmed A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions
title_short A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions
title_sort brain-computer interface for potential non-verbal facial communication based on eeg signals related to specific emotions
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4144423/
https://www.ncbi.nlm.nih.gov/pubmed/25206321
http://dx.doi.org/10.3389/fnins.2014.00244
work_keys_str_mv AT kashiharakoji abraincomputerinterfaceforpotentialnonverbalfacialcommunicationbasedoneegsignalsrelatedtospecificemotions
AT kashiharakoji braincomputerinterfaceforpotentialnonverbalfacialcommunicationbasedoneegsignalsrelatedtospecificemotions