An EEG-based attention recognition method: fusion of time domain, frequency domain, and non-linear dynamics features
Main authors: | Chen, Di; Huang, Haiyun; Bao, Xiaoyu; Pan, Jiahui; Li, Yuanqing |
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2023 |
Subjects: | Neuroscience |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10368951/ https://www.ncbi.nlm.nih.gov/pubmed/37502681 http://dx.doi.org/10.3389/fnins.2023.1194554 |
_version_ | 1785077617024565248 |
author | Chen, Di; Huang, Haiyun; Bao, Xiaoyu; Pan, Jiahui; Li, Yuanqing |
author_facet | Chen, Di; Huang, Haiyun; Bao, Xiaoyu; Pan, Jiahui; Li, Yuanqing |
author_sort | Chen, Di |
collection | PubMed |
description | INTRODUCTION: Attention is a complex cognitive function of the human brain that plays a vital role in our daily lives. The electroencephalogram (EEG) is used to measure and analyze attention because of its high temporal resolution. Although several attention recognition brain-computer interfaces (BCIs) have been proposed, there is a scarcity of studies with a sufficient number of subjects, valid paradigms, and reliable recognition analysis across subjects. METHODS: In this study, we proposed a novel attention paradigm and a feature fusion method that combines time domain, frequency domain, and nonlinear dynamics features. We then constructed an attention recognition framework for 85 subjects. RESULTS AND DISCUSSION: We achieved an intra-subject average classification accuracy of 85.05% ± 6.87% and an inter-subject average classification accuracy of 81.60% ± 9.93%. We further explored the neural patterns underlying attention recognition: attention states showed less activation than non-attention states in the prefrontal and occipital areas in the α, β, and θ bands. This research explores, for the first time, the fusion of time domain, frequency domain, and nonlinear dynamics features for attention recognition, providing a new understanding of attention recognition. |
format | Online Article Text |
id | pubmed-10368951 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-10368951 2023-07-27 An EEG-based attention recognition method: fusion of time domain, frequency domain, and non-linear dynamics features Chen, Di; Huang, Haiyun; Bao, Xiaoyu; Pan, Jiahui; Li, Yuanqing Front Neurosci Neuroscience Frontiers Media S.A. 2023-07-12 /pmc/articles/PMC10368951/ /pubmed/37502681 http://dx.doi.org/10.3389/fnins.2023.1194554 Text en Copyright © 2023 Chen, Huang, Bao, Pan and Li. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience; Chen, Di; Huang, Haiyun; Bao, Xiaoyu; Pan, Jiahui; Li, Yuanqing; An EEG-based attention recognition method: fusion of time domain, frequency domain, and non-linear dynamics features |
title | An EEG-based attention recognition method: fusion of time domain, frequency domain, and non-linear dynamics features |
title_full | An EEG-based attention recognition method: fusion of time domain, frequency domain, and non-linear dynamics features |
title_fullStr | An EEG-based attention recognition method: fusion of time domain, frequency domain, and non-linear dynamics features |
title_full_unstemmed | An EEG-based attention recognition method: fusion of time domain, frequency domain, and non-linear dynamics features |
title_short | An EEG-based attention recognition method: fusion of time domain, frequency domain, and non-linear dynamics features |
title_sort | eeg-based attention recognition method: fusion of time domain, frequency domain, and non-linear dynamics features |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10368951/ https://www.ncbi.nlm.nih.gov/pubmed/37502681 http://dx.doi.org/10.3389/fnins.2023.1194554 |
work_keys_str_mv | AT chendi aneegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures AT huanghaiyun aneegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures AT baoxiaoyu aneegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures AT panjiahui aneegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures AT liyuanqing aneegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures AT chendi eegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures AT huanghaiyun eegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures AT baoxiaoyu eegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures AT panjiahui eegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures AT liyuanqing eegbasedattentionrecognitionmethodfusionoftimedomainfrequencydomainandnonlineardynamicsfeatures |
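The description field above summarizes a pipeline that fuses time domain, frequency domain, and nonlinear dynamics features from EEG and classifies attention versus non-attention states. Below is a minimal sketch of that general idea, not the authors' implementation: the sampling rate, channel count, band limits, the concrete features (Hjorth parameters, relative band power from a Welch spectrum, a naive sample entropy), the SVM classifier, and the random placeholder data are all assumptions introduced for illustration.

```python
# Minimal sketch of time domain + frequency domain + nonlinear feature fusion
# for EEG attention classification. All parameters below (sampling rate,
# channel count, band limits, feature choices, classifier) are illustrative
# assumptions, not values taken from the paper.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 250  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed limits


def time_domain_features(epoch):
    """Per-channel mean, variance, and Hjorth mobility/complexity.
    `epoch` has shape (n_channels, n_samples)."""
    d1 = np.diff(epoch, axis=1)
    d2 = np.diff(d1, axis=1)
    var0, var1, var2 = epoch.var(axis=1), d1.var(axis=1), d2.var(axis=1)
    mobility = np.sqrt(var1 / var0)
    complexity = np.sqrt(var2 / var1) / mobility
    return np.concatenate([epoch.mean(axis=1), var0, mobility, complexity])


def frequency_domain_features(epoch, fs=FS):
    """Relative theta/alpha/beta band power from a Welch power spectrum."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs, axis=1)
    total = psd.sum(axis=1)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].sum(axis=1) / total
             for lo, hi in BANDS.values()]
    return np.concatenate(feats)


def sample_entropy(x, m=2, r_factor=0.2):
    """Naive O(n^2) sample entropy of a 1-D signal; fine for short epochs."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def pair_count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        dists = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return (np.sum(dists <= r) - len(templates)) / 2.0  # drop self-matches

    b, a = pair_count(m), pair_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else 0.0


def nonlinear_features(epoch):
    """One sample-entropy value per channel."""
    return np.array([sample_entropy(ch) for ch in epoch])


def fused_features(epoch):
    """Concatenate the three feature families into a single vector."""
    return np.concatenate([time_domain_features(epoch),
                           frequency_domain_features(epoch),
                           nonlinear_features(epoch)])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Placeholder data: 40 epochs x 8 channels x 2 s; labels are random,
    # so the accuracy printed here is only a smoke test of the pipeline.
    epochs = rng.standard_normal((40, 8, 2 * FS))
    labels = rng.integers(0, 2, size=40)  # 1 = attention, 0 = non-attention
    X = np.stack([fused_features(ep) for ep in epochs])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    print("5-fold CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

In practice the random epochs would be replaced by band-pass-filtered EEG segments cut around the attention and non-attention conditions of the paradigm, and the intra- and inter-subject accuracies reported in the abstract would come from subject-wise evaluation rather than the pooled 5-fold split used in this smoke test.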