Evaluation of electrocardiogram: numerical vs. image data for emotion recognition system
Main Authors: Sayed Ismail, Sharifah Noor Masidayu; Ab. Aziz, Nor Azlina; Ibrahim, Siti Zainab; Nawawi, Sophan Wahyudi; Alelyani, Salem; Mohana, Mohamed; Chia Chun, Lee
Format: Online Article Text
Language: English
Published: F1000 Research Limited, 2022
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9171287/ https://www.ncbi.nlm.nih.gov/pubmed/35685688 http://dx.doi.org/10.12688/f1000research.73255.2
_version_ | 1784721631974785024 |
author | Sayed Ismail, Sharifah Noor Masidayu; Ab. Aziz, Nor Azlina; Ibrahim, Siti Zainab; Nawawi, Sophan Wahyudi; Alelyani, Salem; Mohana, Mohamed; Chia Chun, Lee
author_facet | Sayed Ismail, Sharifah Noor Masidayu; Ab. Aziz, Nor Azlina; Ibrahim, Siti Zainab; Nawawi, Sophan Wahyudi; Alelyani, Salem; Mohana, Mohamed; Chia Chun, Lee
author_sort | Sayed Ismail, Sharifah Noor Masidayu |
collection | PubMed |
description | Background: The electrocardiogram (ECG) is a physiological signal used to diagnose and monitor cardiovascular disease, usually using 2-D ECG. Numerous studies have shown that ECG can be used to detect human emotions using 1-D ECG; however, ECG is typically captured as 2-D images rather than as 1-D data. There is still no consensus on the effect of the ECG input format on the accuracy of the emotion recognition system (ERS), and the ERS using 2-D ECG remains inadequately studied. Therefore, this study compared ERS performance using 1-D and 2-D ECG data to investigate the effect of the ECG input format on the ERS. Methods: This study employed the DREAMER dataset, which contains 23 ECG recordings obtained during audio-visual emotional elicitation. Numerical data were converted to ECG images for the comparison. Several approaches were used to obtain ECG features. The Augsburg BioSignal Toolbox (AUBT) and the Toolbox for Emotional feature extraction from Physiological signals (TEAP) extracted features from the numerical data, while features were extracted from the image data using Oriented FAST and Rotated BRIEF (ORB), Scale-Invariant Feature Transform (SIFT), KAZE, Accelerated-KAZE (AKAZE), Binary Robust Invariant Scalable Keypoints (BRISK), and Histogram of Oriented Gradients (HOG). Dimension reduction was accomplished using linear discriminant analysis (LDA), and valence and arousal were classified using a Support Vector Machine (SVM). Results: The experimental results show that the 1-D ECG-based ERS achieved an accuracy of 65.06% and an F1-score of 75.63% for valence, and an accuracy of 57.83% and an F1-score of 44.44% for arousal. For the 2-D ECG-based ERS, the highest accuracy and F1-score for valence were 62.35% and 49.57%, whereas for arousal they were 59.64% and 59.71%. Conclusions: The results indicate that both inputs work comparably well in classifying emotions, which demonstrates the potential of both 1-D and 2-D ECG as input modalities for the ERS.
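The description only sketches the 2-D branch of the pipeline at a high level (signal-to-image conversion, image descriptors such as HOG, LDA, then SVM). The Python sketch below illustrates one plausible realisation of that branch; it is not the authors' implementation. It assumes NumPy, scikit-image, and scikit-learn, uses synthetic signals in place of the DREAMER recordings, and the rasterisation helper (`signal_to_image`) and all parameter values are illustrative assumptions only.

```python
# Minimal sketch of the 2-D branch described in the abstract:
# a 1-D ECG trace is rendered as an image, HOG features are extracted,
# LDA reduces the dimensionality, and an SVM predicts a binary valence label.
# Synthetic signals stand in for the DREAMER recordings; all parameters are illustrative.
import numpy as np
from skimage.feature import hog
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def signal_to_image(sig, height=64, width=256):
    """Rasterise a 1-D signal into a binary 2-D image (trace on a blank canvas)."""
    img = np.zeros((height, width), dtype=np.uint8)
    # Resample to the image width and scale amplitudes to pixel rows.
    xs = np.linspace(0, len(sig) - 1, width).astype(int)
    sig = sig[xs]
    sig = (sig - sig.min()) / (np.ptp(sig) + 1e-9)
    rows = ((1.0 - sig) * (height - 1)).astype(int)
    img[rows, np.arange(width)] = 255
    return img

# Toy dataset: noisy sinusoid-like "ECG" segments with two classes.
X_img, y = [], []
for label in (0, 1):
    for _ in range(40):
        t = np.linspace(0, 4 * np.pi, 1024)
        freq = 1.0 + 0.5 * label               # class-dependent "heart rate"
        sig = np.sin(freq * t) + 0.1 * rng.standard_normal(t.size)
        img = signal_to_image(sig)
        # HOG descriptor of the rendered ECG image (one of the 2-D feature sets
        # named in the abstract; ORB/SIFT/KAZE descriptors could be swapped in).
        X_img.append(hog(img, pixels_per_cell=(8, 8), cells_per_block=(2, 2)))
        y.append(label)

X_img, y = np.asarray(X_img), np.asarray(y)
X_tr, X_te, y_tr, y_te = train_test_split(X_img, y, test_size=0.25, random_state=0)

# LDA for dimension reduction followed by an SVM classifier, as in the abstract.
clf = make_pipeline(LinearDiscriminantAnalysis(n_components=1), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print("toy valence accuracy:", clf.score(X_te, y_te))
```

The 1-D branch would reuse the same LDA-plus-SVM stage, feeding it AUBT- or TEAP-style features computed directly from the numerical signal instead of image descriptors.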
format | Online Article Text |
id | pubmed-9171287 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | F1000 Research Limited |
record_format | MEDLINE/PubMed |
spelling | pubmed-9171287 2022-06-08 Evaluation of electrocardiogram: numerical vs. image data for emotion recognition system Sayed Ismail, Sharifah Noor Masidayu; Ab. Aziz, Nor Azlina; Ibrahim, Siti Zainab; Nawawi, Sophan Wahyudi; Alelyani, Salem; Mohana, Mohamed; Chia Chun, Lee F1000Res Research Article F1000 Research Limited 2022-05-30 /pmc/articles/PMC9171287/ /pubmed/35685688 http://dx.doi.org/10.12688/f1000research.73255.2 Text en Copyright: © 2022 Sayed Ismail SNM et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution Licence, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle | Research Article Sayed Ismail, Sharifah Noor Masidayu; Ab. Aziz, Nor Azlina; Ibrahim, Siti Zainab; Nawawi, Sophan Wahyudi; Alelyani, Salem; Mohana, Mohamed; Chia Chun, Lee Evaluation of electrocardiogram: numerical vs. image data for emotion recognition system
title | Evaluation of electrocardiogram: numerical vs. image data for emotion recognition system |
title_full | Evaluation of electrocardiogram: numerical vs. image data for emotion recognition system |
title_fullStr | Evaluation of electrocardiogram: numerical vs. image data for emotion recognition system |
title_full_unstemmed | Evaluation of electrocardiogram: numerical vs. image data for emotion recognition system |
title_short | Evaluation of electrocardiogram: numerical vs. image data for emotion recognition system |
title_sort | evaluation of electrocardiogram: numerical vs. image data for emotion recognition system |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9171287/ https://www.ncbi.nlm.nih.gov/pubmed/35685688 http://dx.doi.org/10.12688/f1000research.73255.2 |
work_keys_str_mv | AT sayedismailsharifahnoormasidayu evaluationofelectrocardiogramnumericalvsimagedataforemotionrecognitionsystem AT abaziznorazlina evaluationofelectrocardiogramnumericalvsimagedataforemotionrecognitionsystem AT ibrahimsitizainab evaluationofelectrocardiogramnumericalvsimagedataforemotionrecognitionsystem AT nawawisophanwahyudi evaluationofelectrocardiogramnumericalvsimagedataforemotionrecognitionsystem AT alelyanisalem evaluationofelectrocardiogramnumericalvsimagedataforemotionrecognitionsystem AT mohanamohamed evaluationofelectrocardiogramnumericalvsimagedataforemotionrecognitionsystem AT chiachunlee evaluationofelectrocardiogramnumericalvsimagedataforemotionrecognitionsystem |