Sparse Logistic Regression With L(1/2) Penalty for Emotion Recognition in Electroencephalography Classification
Main authors: Chen, Dong-Wei; Miao, Rui; Deng, Zhao-Yong; Lu, Yue-Yue; Liang, Yong; Huang, Lan
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2020
Subjects: Neuroscience
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7427509/ https://www.ncbi.nlm.nih.gov/pubmed/32848688 http://dx.doi.org/10.3389/fninf.2020.00029
_version_ | 1783570889841836032 |
author | Chen, Dong-Wei Miao, Rui Deng, Zhao-Yong Lu, Yue-Yue Liang, Yong Huang, Lan |
author_facet | Chen, Dong-Wei Miao, Rui Deng, Zhao-Yong Lu, Yue-Yue Liang, Yong Huang, Lan |
author_sort | Chen, Dong-Wei |
collection | PubMed |
description | Emotion recognition based on electroencephalography (EEG) signals is a current focus in brain-computer interface research. However, the classification of EEG is difficult owing to large amounts of data and high levels of noise. Therefore, it is important to determine how to effectively extract features that carry important information. Regularization, one of the effective methods for EEG signal processing, can extract important features from the signal and has potential applications in EEG emotion recognition. Currently, the most popular regularization techniques are Lasso (L(1)) and Ridge Regression (L(2)). In recent years, researchers have proposed many other regularization terms. In theory, an L(q)-type penalty with a smaller q yields sparser solutions. L(1/2) regularization is of L(q) type (0 < q < 1) and has been shown to have many attractive properties. In this work, we studied the L(1/2) penalty in sparse logistic regression for three-class EEG emotion recognition, and used a coordinate descent algorithm with a univariate half-thresholding operator to implement L(1/2)-penalized logistic regression. The experimental results on simulated and real data demonstrate that the proposed method outperforms other existing regularization methods. Sparse logistic regression with the L(1/2) penalty achieves higher classification accuracy than the conventional L(1), Ridge Regression, and Elastic Net regularization methods, using fewer but more informative EEG signals. This is very important for high-dimensional, small-sample EEG data and can help researchers reduce computational complexity and improve computational accuracy. Therefore, we propose that sparse logistic regression with the L(1/2) penalty is an effective technique for emotion recognition in practical classification problems. |
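The description names the two computational ingredients of the method: a cyclic coordinate descent loop and a univariate half-thresholding operator that solves the one-dimensional L(1/2)-penalized subproblem in closed form. The sketch below is not the authors' code; it is a minimal binary-class illustration (the paper addresses three-class EEG recognition), with assumed names `half_threshold` and `l_half_logistic`. It uses the closed-form half-thresholding operator from L(1/2) thresholding theory and the standard 1/4 curvature bound on the logistic loss.

```python
import numpy as np

def half_threshold(z, lam):
    """Closed-form minimizer of 0.5*(t - z)**2 + lam*|t|**0.5,
    the univariate subproblem of the L(1/2) penalty."""
    # Below this magnitude the minimizer is exactly zero.
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    if abs(z) <= thresh:
        return 0.0
    phi = np.arccos((lam / 8.0) * (abs(z) / 3.0) ** -1.5)
    return (2.0 / 3.0) * z * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))

def l_half_logistic(X, y, lam=0.1, n_iter=50):
    """Binary logistic regression with an L(1/2) penalty, fitted by cyclic
    coordinate descent: each coordinate takes a bounded-curvature Newton
    step and is then passed through the half-thresholding operator."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            prob = 1.0 / (1.0 + np.exp(-(X @ w)))        # current predictions
            grad_j = X[:, j] @ (prob - y) / n            # partial gradient
            curv_j = 0.25 * (X[:, j] @ X[:, j]) / n      # logistic curvature <= 1/4
            z = w[j] - grad_j / curv_j                   # unpenalized coordinate step
            w[j] = half_threshold(z, lam / curv_j)       # penalized update
    return w
```

Because the L(1/2) penalty is nonconvex, this procedure finds a stationary point rather than a guaranteed global optimum; in practice the penalty weight would be tuned by cross-validation, as with the Lasso and Elastic Net baselines the description compares against.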
format | Online Article Text |
id | pubmed-7427509 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-74275092020-08-25 Sparse Logistic Regression With L(1/2) Penalty for Emotion Recognition in Electroencephalography Classification Chen, Dong-Wei Miao, Rui Deng, Zhao-Yong Lu, Yue-Yue Liang, Yong Huang, Lan Front Neuroinform Neuroscience [abstract identical to the description field above] Frontiers Media S.A. 2020-08-07 /pmc/articles/PMC7427509/ /pubmed/32848688 http://dx.doi.org/10.3389/fninf.2020.00029 Text en Copyright © 2020 Chen, Miao, Deng, Lu, Liang and Huang. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Chen, Dong-Wei Miao, Rui Deng, Zhao-Yong Lu, Yue-Yue Liang, Yong Huang, Lan Sparse Logistic Regression With L(1/2) Penalty for Emotion Recognition in Electroencephalography Classification |
title | Sparse Logistic Regression With L(1/2) Penalty for Emotion Recognition in Electroencephalography Classification |
title_full | Sparse Logistic Regression With L(1/2) Penalty for Emotion Recognition in Electroencephalography Classification |
title_fullStr | Sparse Logistic Regression With L(1/2) Penalty for Emotion Recognition in Electroencephalography Classification |
title_full_unstemmed | Sparse Logistic Regression With L(1/2) Penalty for Emotion Recognition in Electroencephalography Classification |
title_short | Sparse Logistic Regression With L(1/2) Penalty for Emotion Recognition in Electroencephalography Classification |
title_sort | sparse logistic regression with l(1/2) penalty for emotion recognition in electroencephalography classification |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7427509/ https://www.ncbi.nlm.nih.gov/pubmed/32848688 http://dx.doi.org/10.3389/fninf.2020.00029 |
work_keys_str_mv | AT chendongwei sparselogisticregressionwithl12penaltyforemotionrecognitioninelectroencephalographyclassification AT miaorui sparselogisticregressionwithl12penaltyforemotionrecognitioninelectroencephalographyclassification AT dengzhaoyong sparselogisticregressionwithl12penaltyforemotionrecognitioninelectroencephalographyclassification AT luyueyue sparselogisticregressionwithl12penaltyforemotionrecognitioninelectroencephalographyclassification AT liangyong sparselogisticregressionwithl12penaltyforemotionrecognitioninelectroencephalographyclassification AT huanglan sparselogisticregressionwithl12penaltyforemotionrecognitioninelectroencephalographyclassification |