
Automatic subject-specific spatiotemporal feature selection for subject-independent affective BCI

Bibliographic Details
Main Authors: Almarri, Badar; Rajasekaran, Sanguthevar; Huang, Chun-Hsi
Format: Online Article Text
Language: English
Published: Public Library of Science 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8389489/
https://www.ncbi.nlm.nih.gov/pubmed/34437542
http://dx.doi.org/10.1371/journal.pone.0253383
_version_ 1783742870431203328
author Almarri, Badar
Rajasekaran, Sanguthevar
Huang, Chun-Hsi
author_facet Almarri, Badar
Rajasekaran, Sanguthevar
Huang, Chun-Hsi
author_sort Almarri, Badar
collection PubMed
description The dimensionality of the spatially distributed channels and the temporal resolution of electroencephalogram (EEG) based brain-computer interfaces (BCI) undermine emotion recognition models. Thus, before such data reach modeling as the final stage of the learning pipeline, adequate preprocessing, transformation, and extraction of temporal (i.e., time-series signal) and spatial (i.e., electrode channel) features are essential for recognizing the underlying human emotions. Conventionally, inter-subject variation is dealt with by avoiding the sources of variation (e.g., outliers) or by turning the problem into a subject-dependent one. We address this issue by preserving and learning from individual particularities in response to affective stimuli. This paper investigates and proposes a subject-independent emotion recognition framework that mitigates subject-to-subject variability in such systems. Using an unsupervised feature selection algorithm, we reduce the feature space extracted from the time-series signals. For the spatial features, we propose a subject-specific unsupervised learning algorithm that learns from inter-channel co-activation online. We tested this framework on real EEG benchmarks, namely DEAP, MAHNOB-HCI, and DREAMER. We train and test on the selection outcomes using nested cross-validation and a support vector machine (SVM), and compare our results with state-of-the-art subject-independent algorithms. Our results show enhanced performance, classifying human affect (i.e., valence and arousal) 16%–27% more accurately than other studies. This work not only outperforms other subject-independent studies reported in the literature but also proposes an online analysis solution for affect recognition.
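A minimal sketch of the evaluation step named in the abstract (nested cross-validation around an SVM over already-selected features). The synthetic data, scikit-learn usage, and hyperparameter grid below are illustrative assumptions, not the authors' implementation.

    # Illustrative only: nested cross-validation with an SVM on a
    # pre-extracted EEG feature matrix. Data shapes and the parameter
    # grid are assumptions, not taken from the paper.
    import numpy as np
    from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical data: n_trials x n_selected_features, binary valence labels.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((120, 32))
    y = rng.integers(0, 2, size=120)

    # Inner loop tunes SVM hyperparameters; outer loop estimates generalization.
    inner_cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    outer_cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    param_grid = {"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01]}
    search = GridSearchCV(model, param_grid, cv=inner_cv)

    scores = cross_val_score(search, X, y, cv=outer_cv)
    print(f"Nested-CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

The point of the nesting is that hyperparameter selection happens entirely inside the inner folds, so the outer-fold accuracy is an unbiased estimate of generalization to unseen trials.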
format Online
Article
Text
id pubmed-8389489
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-8389489 2021-08-27 Automatic subject-specific spatiotemporal feature selection for subject-independent affective BCI Almarri, Badar Rajasekaran, Sanguthevar Huang, Chun-Hsi PLoS One Research Article The dimensionality of the spatially distributed channels and the temporal resolution of electroencephalogram (EEG) based brain-computer interfaces (BCI) undermine emotion recognition models. Thus, before such data reach modeling as the final stage of the learning pipeline, adequate preprocessing, transformation, and extraction of temporal (i.e., time-series signal) and spatial (i.e., electrode channel) features are essential for recognizing the underlying human emotions. Conventionally, inter-subject variation is dealt with by avoiding the sources of variation (e.g., outliers) or by turning the problem into a subject-dependent one. We address this issue by preserving and learning from individual particularities in response to affective stimuli. This paper investigates and proposes a subject-independent emotion recognition framework that mitigates subject-to-subject variability in such systems. Using an unsupervised feature selection algorithm, we reduce the feature space extracted from the time-series signals. For the spatial features, we propose a subject-specific unsupervised learning algorithm that learns from inter-channel co-activation online. We tested this framework on real EEG benchmarks, namely DEAP, MAHNOB-HCI, and DREAMER. We train and test on the selection outcomes using nested cross-validation and a support vector machine (SVM), and compare our results with state-of-the-art subject-independent algorithms. Our results show enhanced performance, classifying human affect (i.e., valence and arousal) 16%–27% more accurately than other studies. This work not only outperforms other subject-independent studies reported in the literature but also proposes an online analysis solution for affect recognition. Public Library of Science 2021-08-26 /pmc/articles/PMC8389489/ /pubmed/34437542 http://dx.doi.org/10.1371/journal.pone.0253383 Text en © 2021 Almarri et al https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Almarri, Badar
Rajasekaran, Sanguthevar
Huang, Chun-Hsi
Automatic subject-specific spatiotemporal feature selection for subject-independent affective BCI
title Automatic subject-specific spatiotemporal feature selection for subject-independent affective BCI
title_full Automatic subject-specific spatiotemporal feature selection for subject-independent affective BCI
title_fullStr Automatic subject-specific spatiotemporal feature selection for subject-independent affective BCI
title_full_unstemmed Automatic subject-specific spatiotemporal feature selection for subject-independent affective BCI
title_short Automatic subject-specific spatiotemporal feature selection for subject-independent affective BCI
title_sort automatic subject-specific spatiotemporal feature selection for subject-independent affective bci
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8389489/
https://www.ncbi.nlm.nih.gov/pubmed/34437542
http://dx.doi.org/10.1371/journal.pone.0253383
work_keys_str_mv AT almarribadar automaticsubjectspecificspatiotemporalfeatureselectionforsubjectindependentaffectivebci
AT rajasekaransanguthevar automaticsubjectspecificspatiotemporalfeatureselectionforsubjectindependentaffectivebci
AT huangchunhsi automaticsubjectspecificspatiotemporalfeatureselectionforsubjectindependentaffectivebci