
Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening

Electroencephalography (EEG)-based emotion classification during music listening has attracted increasing attention because of its potential applications, such as musical affective brain-computer interfaces (ABCI), neuromarketing, music therapy, and implicit multimedia tagging and trigger...


Bibliographic Details
Main Authors: Lin, Yuan-Pin, Yang, Yi-Hsuan, Jung, Tzyy-Ping
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2014
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4013455/
https://www.ncbi.nlm.nih.gov/pubmed/24822035
http://dx.doi.org/10.3389/fnins.2014.00094
author Lin, Yuan-Pin
Yang, Yi-Hsuan
Jung, Tzyy-Ping
collection PubMed
description Electroencephalography (EEG)-based emotion classification during music listening has attracted increasing attention because of its potential applications, such as musical affective brain-computer interfaces (ABCI), neuromarketing, music therapy, and implicit multimedia tagging and triggering. However, music is an ecologically valid and complex stimulus that conveys emotions to listeners through the composition of musical elements, and distinguishing emotions from EEG signals alone remains challenging. This study assessed the applicability of a multimodal approach that leverages EEG dynamics and the acoustic characteristics of musical contents to classify emotional valence and arousal. To this end, machine-learning methods were adopted to systematically elucidate the roles of the EEG and music modalities in emotion modeling. The empirical results suggested that when whole-head EEG signals were available, the inclusion of musical contents did not improve classification performance: the 74-76% accuracy obtained using the EEG modality alone was statistically comparable to that of the multimodal approach. However, when EEG dynamics were available from only a small set of electrodes (the likely case in real-life applications), the music modality played a complementary role, improving the EEG-only results from around 61% to 67% for valence classification and from around 58% to 67% for arousal classification. Musical timbre appeared to replace less-discriminative EEG features, improving both valence and arousal classification, whereas musical loudness contributed specifically to arousal classification. The present study not only provides principles for constructing an EEG-based multimodal approach but also reveals fundamental insights into the interplay of brain activity and musical contents in emotion modeling.
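The abstract describes combining EEG dynamics with acoustic descriptors of the music (e.g., timbre and loudness) to classify valence and arousal, but does not specify the fusion scheme. The following is a minimal, hypothetical sketch of one plausible scheme (feature-level concatenation with an SVM classifier); it is not the authors' implementation, and the feature dimensions, synthetic data, and classifier choice are illustrative assumptions only.

# Hypothetical sketch: feature-level fusion of EEG and music features for
# binary valence/arousal classification. Not the authors' pipeline.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials = 200

# Placeholder EEG features: e.g., spectral band power (theta/alpha/beta/gamma)
# from a small electrode set -- 4 bands x 4 channels = 16 features per trial.
eeg = rng.normal(size=(n_trials, 16))

# Placeholder music features: e.g., timbre (MFCC statistics) and loudness
# descriptors extracted from the listened excerpts.
music = rng.normal(size=(n_trials, 20))

# Binary labels: high vs. low valence (or arousal) self-reports.
y = rng.integers(0, 2, size=n_trials)

# Feature-level fusion: concatenate the two modalities before classification.
fused = np.hstack([eeg, music])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
for name, X in [("EEG only", eeg), ("EEG + music", fused)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name:12s} mean 5-fold CV accuracy: {acc:.2f}")

With real data, the EEG block would hold per-electrode spectral features and the music block would hold acoustic descriptors aligned to the same excerpts; decision-level fusion (combining separate classifiers per modality) would be an alternative scheme.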
format Online
Article
Text
id pubmed-4013455
institution National Center for Biotechnology Information
language English
publishDate 2014
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-4013455 2014-05-12 Front Neurosci (Neuroscience) Frontiers Media S.A. 2014-05-01 /pmc/articles/PMC4013455/ /pubmed/24822035 http://dx.doi.org/10.3389/fnins.2014.00094 Text en Copyright © 2014 Lin, Yang and Jung. http://creativecommons.org/licenses/by/3.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4013455/
https://www.ncbi.nlm.nih.gov/pubmed/24822035
http://dx.doi.org/10.3389/fnins.2014.00094
work_keys_str_mv AT linyuanpin fusionofelectroencephalographicdynamicsandmusicalcontentsforestimatingemotionalresponsesinmusiclistening
AT yangyihsuan fusionofelectroencephalographicdynamicsandmusicalcontentsforestimatingemotionalresponsesinmusiclistening
AT jungtzyyping fusionofelectroencephalographicdynamicsandmusicalcontentsforestimatingemotionalresponsesinmusiclistening