
K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations

Recognizing emotions during social interactions has many potential applications with the popularization of low-cost mobile sensors, but a challenge remains in the lack of naturalistic affective interaction data. Most existing emotion datasets do not support studying idiosyncratic emotions arising in the wild, as they were collected in constrained environments. Studying emotions in the context of social interactions therefore requires a novel dataset, and K-EmoCon is such a multimodal dataset, with comprehensive annotations of continuous emotions during naturalistic conversations. The dataset contains multimodal measurements, including audiovisual recordings, EEG, and peripheral physiological signals, acquired with off-the-shelf devices from 16 sessions of approximately 10-minute-long paired debates on a social issue. Distinct from previous datasets, it includes emotion annotations from all three available perspectives: self, debate partner, and external observers. Raters annotated emotional displays at 5-second intervals while viewing the debate footage, in terms of arousal-valence and 18 additional categorical emotions. The resulting K-EmoCon is the first publicly available emotion dataset to accommodate multiperspective assessment of emotions during social interactions.
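The abstract describes a fixed annotation protocol: each rater labels 5-second intervals of a roughly 10-minute debate with arousal-valence (plus categorical emotions), from three perspectives. A minimal sketch of that timeline, assuming the nominal session length; the constants and function name here are illustrative, not part of the dataset's actual schema:

```python
# Hypothetical sketch of the K-EmoCon annotation timeline: each rater
# (self, partner, external observer) labels 5-second intervals of an
# approximately 10-minute debate session.

SESSION_SECONDS = 10 * 60   # nominal debate length (sessions are approximate)
INTERVAL_SECONDS = 5        # annotation resolution per the abstract

def annotation_windows(session_seconds=SESSION_SECONDS,
                       interval=INTERVAL_SECONDS):
    """Return (start, end) second offsets for each annotation interval."""
    return [(t, t + interval) for t in range(0, session_seconds, interval)]

windows = annotation_windows()
print(len(windows))             # 120 intervals for a full 10-minute session
print(windows[0], windows[-1])  # (0, 5) (595, 600)
```

With three perspectives, a full 10-minute session would yield on the order of 3 × 120 interval annotations, before accounting for session-length variation.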


Bibliographic Details
Main Authors: Park, Cheul Young; Cha, Narae; Kang, Soowon; Kim, Auk; Khandoker, Ahsan Habib; Hadjileontiadis, Leontios; Oh, Alice; Jeong, Yong; Lee, Uichin
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2020
Subjects: Data Descriptor
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7479607/
https://www.ncbi.nlm.nih.gov/pubmed/32901038
http://dx.doi.org/10.1038/s41597-020-00630-y
Collection: PubMed (record pubmed-7479607), National Center for Biotechnology Information
Journal: Sci Data (Data Descriptor)
Published online: 2020-09-08
License: © The Author(s) 2020. Open Access under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/); the Creative Commons CC0 Public Domain Dedication (http://creativecommons.org/publicdomain/zero/1.0/) applies to the metadata files associated with this article.