Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables
The Emognition dataset is dedicated to testing methods for emotion recognition (ER) from physiological responses and facial expressions. We collected data from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness.
Main Authors: Saganowski, Stanisław; Komoszyńska, Joanna; Behnke, Maciej; Perz, Bartosz; Kunc, Dominika; Klich, Bartłomiej; Kaczmarek, Łukasz D.; Kazienko, Przemysław
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022
Subjects: Data Descriptor
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8989970/ https://www.ncbi.nlm.nih.gov/pubmed/35393434 http://dx.doi.org/10.1038/s41597-022-01262-0
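The description above implies a natural per-trial structure: for each participant and each film clip, a set of wearable signal streams plus two post-clip self-reports (nine discrete emotions; valence, arousal, and motivation). Below is a minimal Python sketch of how one such trial might be represented. All class names, field names, and the assumed 1–5 rating scale are hypothetical illustrations, not the dataset's actual schema or an official loader; consult the dataset's own documentation for the real file layout.

```python
# A minimal sketch (not the authors' official loader) of one stimulus
# trial from the Emognition dataset. All names here are hypothetical.
from dataclasses import dataclass
from typing import Dict

import numpy as np

DISCRETE_EMOTIONS = [
    "amusement", "awe", "enthusiasm", "liking", "surprise",
    "anger", "disgust", "fear", "sadness",
]

@dataclass
class SelfReport:
    """The two self-reports collected after each film clip."""
    discrete: Dict[str, int]  # intensity per discrete emotion (1-5 scale assumed)
    valence: int              # three affective dimensions
    arousal: int
    motivation: int

@dataclass
class Trial:
    """One participant watching one emotion-eliciting film clip."""
    participant_id: int
    clip_emotion: str               # the targeted emotion for this clip
    signals: Dict[str, np.ndarray]  # e.g. "EEG", "BVP", "HR", "EDA", "SKT", "ACC", "GYRO"
    report: SelfReport

def targeted_emotion_hit(trial: Trial) -> bool:
    """Toy check in the spirit of the paper's technical validation:
    did the participant rate the targeted emotion highest?"""
    top = max(trial.report.discrete, key=trial.report.discrete.get)
    return top == trial.clip_emotion
```

From trials shaped like this, one could, for example, build EEG-only and cardiovascular-only feature sets to compare the two ER routes the abstract mentions, or regress the discrete ratings onto the dimensional ones to study discrete-to-dimensional transitions.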
Field | Value
---|---
_version_ | 1784683286716481536 |
author | Saganowski, Stanisław; Komoszyńska, Joanna; Behnke, Maciej; Perz, Bartosz; Kunc, Dominika; Klich, Bartłomiej; Kaczmarek, Łukasz D.; Kazienko, Przemysław |
author_facet | Saganowski, Stanisław; Komoszyńska, Joanna; Behnke, Maciej; Perz, Bartosz; Kunc, Dominika; Klich, Bartłomiej; Kaczmarek, Łukasz D.; Kazienko, Przemysław |
author_sort | Saganowski, Stanisław |
collection | PubMed |
description | The Emognition dataset is dedicated to testing methods for emotion recognition (ER) from physiological responses and facial expressions. We collected data from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness. Three wearables were used to record physiological data: EEG, BVP (2x), HR, EDA, SKT, ACC (3x), and GYRO (2x); in parallel with the upper-body videos. After each film clip, participants completed two types of self-reports: (1) related to nine discrete emotions and (2) three affective dimensions: valence, arousal, and motivation. The obtained data facilitates various ER approaches, e.g., multimodal ER, EEG- vs. cardiovascular-based ER, discrete to dimensional representation transitions. The technical validation indicated that watching film clips elicited the targeted emotions. It also supported signals’ high quality. |
format | Online Article Text |
id | pubmed-8989970 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-8989970 2022-04-22 Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables Saganowski, Stanisław Komoszyńska, Joanna Behnke, Maciej Perz, Bartosz Kunc, Dominika Klich, Bartłomiej Kaczmarek, Łukasz D. Kazienko, Przemysław Sci Data Data Descriptor The Emognition dataset is dedicated to testing methods for emotion recognition (ER) from physiological responses and facial expressions. We collected data from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness. Three wearables were used to record physiological data: EEG, BVP (2x), HR, EDA, SKT, ACC (3x), and GYRO (2x); in parallel with the upper-body videos. After each film clip, participants completed two types of self-reports: (1) related to nine discrete emotions and (2) three affective dimensions: valence, arousal, and motivation. The obtained data facilitates various ER approaches, e.g., multimodal ER, EEG- vs. cardiovascular-based ER, discrete to dimensional representation transitions. The technical validation indicated that watching film clips elicited the targeted emotions. It also supported signals’ high quality. Nature Publishing Group UK 2022-04-07 /pmc/articles/PMC8989970/ /pubmed/35393434 http://dx.doi.org/10.1038/s41597-022-01262-0 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Data Descriptor Saganowski, Stanisław; Komoszyńska, Joanna; Behnke, Maciej; Perz, Bartosz; Kunc, Dominika; Klich, Bartłomiej; Kaczmarek, Łukasz D.; Kazienko, Przemysław Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables |
title | Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables |
title_full | Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables |
title_fullStr | Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables |
title_full_unstemmed | Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables |
title_short | Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables |
title_sort | emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables |
topic | Data Descriptor |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8989970/ https://www.ncbi.nlm.nih.gov/pubmed/35393434 http://dx.doi.org/10.1038/s41597-022-01262-0 |
work_keys_str_mv | AT saganowskistanisław emognitiondatasetemotionrecognitionwithselfreportsfacialexpressionsandphysiologyusingwearables AT komoszynskajoanna emognitiondatasetemotionrecognitionwithselfreportsfacialexpressionsandphysiologyusingwearables AT behnkemaciej emognitiondatasetemotionrecognitionwithselfreportsfacialexpressionsandphysiologyusingwearables AT perzbartosz emognitiondatasetemotionrecognitionwithselfreportsfacialexpressionsandphysiologyusingwearables AT kuncdominika emognitiondatasetemotionrecognitionwithselfreportsfacialexpressionsandphysiologyusingwearables AT klichbartłomiej emognitiondatasetemotionrecognitionwithselfreportsfacialexpressionsandphysiologyusingwearables AT kaczmarekłukaszd emognitiondatasetemotionrecognitionwithselfreportsfacialexpressionsandphysiologyusingwearables AT kazienkoprzemysław emognitiondatasetemotionrecognitionwithselfreportsfacialexpressionsandphysiologyusingwearables |