An EEG Neurofeedback Interactive Model for Emotional Classification of Electronic Music Compositions Considering Multi-Brain Synergistic Brain-Computer Interfaces
This paper presents an in-depth study and analysis of the emotional classification of EEG neurofeedback interactive electronic music compositions using a multi-brain collaborative brain-computer interface (BCI). Based on previous research, this paper explores the design and performance of sound visu...
Main Author: | Liu, Mingxing |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2022 |
Subjects: | Psychology |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8764261/ https://www.ncbi.nlm.nih.gov/pubmed/35058858 http://dx.doi.org/10.3389/fpsyg.2021.799132 |
_version_ | 1784634124900761600 |
---|---|
author | Liu, Mingxing |
author_facet | Liu, Mingxing |
author_sort | Liu, Mingxing |
collection | PubMed |
description | This paper presents an in-depth study and analysis of the emotional classification of EEG neurofeedback interactive electronic music compositions using a multi-brain collaborative brain-computer interface (BCI). Building on previous research, the paper explores the design and performance of sound visualization in an interactive format from the perspective of visual performance design and the psychology of participating users, drawing on knowledge from disciplines such as psychology, acoustics, aesthetics, neurophysiology, and computer science. It proposes a specific mapping model for converting sound into visual expression, grounded in people’s perception and aesthetic experience of sound and in the phenomenon of audiovisual association, which provides a theoretical basis for the subsequent research. Based on this audio-to-visual mapping pattern, the paper investigates the realization path of interactive sound visualization, its visual expression forms and formal composition, and its aesthetic style, and from these derives a design and expression method for interactive sound visualization intended to benefit practice. To address the neglect of the brain’s real-time, dynamic nature in traditional brain network research, dynamic brain networks are proposed for analyzing EEG signals induced by prolonged music appreciation, during which the connectivity of the brain changes continuously. Mutual information computed on different frequency bands of the EEG signals is used to construct dynamic brain networks, observe how the networks change over time, and perform emotion recognition. Using these brain networks for emotion classification, the study achieved an emotion recognition rate of 67.3% across four classes, exceeding the highest recognition rate previously reported. |
format | Online Article Text |
id | pubmed-8764261 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-8764261 2022-01-19 An EEG Neurofeedback Interactive Model for Emotional Classification of Electronic Music Compositions Considering Multi-Brain Synergistic Brain-Computer Interfaces Liu, Mingxing Front Psychol Psychology Frontiers Media S.A. 2022-01-04 /pmc/articles/PMC8764261/ /pubmed/35058858 http://dx.doi.org/10.3389/fpsyg.2021.799132 Text en Copyright © 2022 Liu. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Psychology Liu, Mingxing An EEG Neurofeedback Interactive Model for Emotional Classification of Electronic Music Compositions Considering Multi-Brain Synergistic Brain-Computer Interfaces |
title | An EEG Neurofeedback Interactive Model for Emotional Classification of Electronic Music Compositions Considering Multi-Brain Synergistic Brain-Computer Interfaces |
title_full | An EEG Neurofeedback Interactive Model for Emotional Classification of Electronic Music Compositions Considering Multi-Brain Synergistic Brain-Computer Interfaces |
title_fullStr | An EEG Neurofeedback Interactive Model for Emotional Classification of Electronic Music Compositions Considering Multi-Brain Synergistic Brain-Computer Interfaces |
title_full_unstemmed | An EEG Neurofeedback Interactive Model for Emotional Classification of Electronic Music Compositions Considering Multi-Brain Synergistic Brain-Computer Interfaces |
title_short | An EEG Neurofeedback Interactive Model for Emotional Classification of Electronic Music Compositions Considering Multi-Brain Synergistic Brain-Computer Interfaces |
title_sort | eeg neurofeedback interactive model for emotional classification of electronic music compositions considering multi-brain synergistic brain-computer interfaces |
topic | Psychology |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8764261/ https://www.ncbi.nlm.nih.gov/pubmed/35058858 http://dx.doi.org/10.3389/fpsyg.2021.799132 |
work_keys_str_mv | AT liumingxing aneegneurofeedbackinteractivemodelforemotionalclassificationofelectronicmusiccompositionsconsideringmultibrainsynergisticbraincomputerinterfaces AT liumingxing eegneurofeedbackinteractivemodelforemotionalclassificationofelectronicmusiccompositionsconsideringmultibrainsynergisticbraincomputerinterfaces |
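The description field above outlines the article's method: compute mutual information between band-filtered EEG channels over time to form a dynamic brain network, then use those networks for four-class emotion recognition. The sketch below is a minimal illustration of what such a pipeline could look like, not the author's implementation; the sampling rate, channel count, band edges, window lengths, and the SVM classifier are illustrative assumptions rather than details reported in the article.

```python
# Minimal sketch (assumptions, not the paper's code): build a "dynamic brain
# network" by computing pairwise mutual information between band-filtered EEG
# channels in sliding windows, then feed the flattened connectivity matrices
# to a standard classifier for four-class emotion recognition.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.feature_selection import mutual_info_regression
from sklearn.svm import SVC

FS = 128                    # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def bandpass(x, lo, hi, fs=FS, order=4):
    """Band-pass filter each channel of an (n_channels, n_samples) array."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def mi_adjacency(window):
    """Pairwise mutual information between channels for one window."""
    n_ch = window.shape[0]
    adj = np.zeros((n_ch, n_ch))
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            mi = mutual_info_regression(window[i][:, None], window[j])[0]
            adj[i, j] = adj[j, i] = mi
    return adj

def dynamic_network_features(eeg, band, win_sec=4, step_sec=2):
    """Slide a window over band-filtered EEG; stack flattened MI adjacency matrices."""
    lo, hi = BANDS[band]
    filtered = bandpass(eeg, lo, hi)
    win, step = int(win_sec * FS), int(step_sec * FS)
    feats = []
    for start in range(0, filtered.shape[1] - win + 1, step):
        adj = mi_adjacency(filtered[:, start:start + win])
        feats.append(adj[np.triu_indices_from(adj, k=1)])  # upper triangle only
    return np.asarray(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((8, 60 * FS))       # 8 channels, 60 s of placeholder EEG
    X = dynamic_network_features(eeg, "alpha")    # one connectivity vector per window
    y = rng.integers(0, 4, size=len(X))           # 4 emotion classes (placeholder labels)
    clf = SVC(kernel="rbf").fit(X, y)             # any off-the-shelf classifier would do
    print(X.shape, clf.score(X, y))
```

In this sketch each sliding window yields one snapshot of the brain network, so the sequence of adjacency matrices captures how connectivity evolves during prolonged music listening, which is the property the description contrasts with static brain network analysis.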