
Impact of Affective Multimedia Content on the Electroencephalogram and Facial Expressions

Bibliographic Details
Main Authors: Siddharth, Siddharth; Jung, Tzyy-Ping; Sejnowski, Terrence J.
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6841664/
https://www.ncbi.nlm.nih.gov/pubmed/31705031
http://dx.doi.org/10.1038/s41598-019-52891-2
author Siddharth, Siddharth
Jung, Tzyy-Ping
Sejnowski, Terrence J.
collection PubMed
description Most of the research in the field of affective computing has focused on detecting and classifying human emotions through electroencephalogram (EEG) signals or facial expressions. Designing multimedia content to evoke certain emotions has been largely guided by manual ratings provided by users. Here we present insights from the correlation of affective features across three modalities: affective multimedia content, EEG, and facial expressions. Interestingly, low-level audio-visual features, such as the contrast and homogeneity of the video and the tone of the audio in the movie clips, are the most correlated with changes in facial expressions and EEG. We also identify the regions of the human face and the brain (in addition to the EEG frequency bands) that are most representative of affective responses. Computational modeling across the three modalities showed a high correlation between features from these regions and user-reported affective labels. Finally, the correlation between different layers of convolutional neural networks, with EEG and face images as input, provides insights into human affect. Together, these findings will assist in (1) designing more effective multimedia content to engage or influence viewers, (2) understanding the brain/body biomarkers of affect, and (3) developing new brain-computer interfaces as well as facial-expression-based algorithms to read viewers' emotional responses.
format Online
Article
Text
id pubmed-6841664
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-6841664 2019-11-14 Impact of Affective Multimedia Content on the Electroencephalogram and Facial Expressions. Siddharth, Siddharth; Jung, Tzyy-Ping; Sejnowski, Terrence J. Sci Rep (Article)
Nature Publishing Group UK 2019-11-08 /pmc/articles/PMC6841664/ /pubmed/31705031 http://dx.doi.org/10.1038/s41598-019-52891-2 Text en © The Author(s) 2019. Open Access under the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
title Impact of Affective Multimedia Content on the Electroencephalogram and Facial Expressions
topic Article