The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction

In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based, HCI-relevant emotional and cognitive load states. It consists of six experimental sequences, inducing Interest, Overload, Normal, Easy, Underload, and Frustration. Each sequence is followed by subjective feedback to validate the induction, a respiration baseline to allow the physiological reactions to level off, and a summary of results. Prior to the experiment, each subject completed three questionnaires on emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI), used to evaluate the stability of the induction paradigm. Based on this HCI scenario, the University of Ulm Multimodal Affective Corpus (uulmMAC), consisting of two homogeneous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities, including 4 × video, 3 × audio, and 7 × biophysiological streams, plus depth and pose streams. In addition, labels and annotations were collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final uulmMAC dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedback shows significant differences between the sequences, consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our uulmMAC database is a valuable contribution to the field of affective computing and multimodal data analysis: acquired in a mobile interactive scenario close to real HCI, it comprises a large number of subjects and allows transtemporal investigations. Validated via subjective feedback and checked for quality issues, it can be used for affective computing and machine learning applications.

Bibliographic Details
Main Authors: Hazer-Rau, Dilana, Meudt, Sascha, Daucher, Andreas, Spohrs, Jennifer, Hoffmann, Holger, Schwenker, Friedhelm, Traue, Harald C.
Format: Online Article Text
Language: English
Published: MDPI 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7219061/
https://www.ncbi.nlm.nih.gov/pubmed/32316626
http://dx.doi.org/10.3390/s20082308
author Hazer-Rau, Dilana
Meudt, Sascha
Daucher, Andreas
Spohrs, Jennifer
Hoffmann, Holger
Schwenker, Friedhelm
Traue, Harald C.
author_facet Hazer-Rau, Dilana
Meudt, Sascha
Daucher, Andreas
Spohrs, Jennifer
Hoffmann, Holger
Schwenker, Friedhelm
Traue, Harald C.
author_sort Hazer-Rau, Dilana
collection PubMed
description In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based, HCI-relevant emotional and cognitive load states. It consists of six experimental sequences, inducing Interest, Overload, Normal, Easy, Underload, and Frustration. Each sequence is followed by subjective feedback to validate the induction, a respiration baseline to allow the physiological reactions to level off, and a summary of results. Prior to the experiment, each subject completed three questionnaires on emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI), used to evaluate the stability of the induction paradigm. Based on this HCI scenario, the University of Ulm Multimodal Affective Corpus (uulmMAC), consisting of two homogeneous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities, including 4 × video, 3 × audio, and 7 × biophysiological streams, plus depth and pose streams. In addition, labels and annotations were collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final uulmMAC dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedback shows significant differences between the sequences, consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our uulmMAC database is a valuable contribution to the field of affective computing and multimodal data analysis: acquired in a mobile interactive scenario close to real HCI, it comprises a large number of subjects and allows transtemporal investigations. Validated via subjective feedback and checked for quality issues, it can be used for affective computing and machine learning applications.
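
To make the corpus structure described in the abstract concrete, the following minimal Python sketch models the six induced sequences and the 16 recorded modalities as a session index. The record does not specify uulmMAC's actual on-disk layout, file formats, or stream names, so the directory scheme, the modality identifiers, and the Session helper below are illustrative assumptions only, not the dataset's published interface.

# Hypothetical sketch of the corpus structure described above; all paths
# and names are illustrative assumptions, not uulmMAC's actual layout.
from dataclasses import dataclass
from enum import Enum
from pathlib import Path

class Sequence(Enum):
    """The six experimental sequences and the states they induce."""
    INTEREST = "Interest"
    OVERLOAD = "Overload"
    NORMAL = "Normal"
    EASY = "Easy"
    UNDERLOAD = "Underload"
    FRUSTRATION = "Frustration"

# The 16 reported sensor modalities: 4 video + 3 audio + 7 biophysiological
# + depth + pose = 16. Names are placeholders, not the dataset's own labels.
MODALITIES = (
    [f"video_{i}" for i in range(1, 5)]
    + [f"audio_{i}" for i in range(1, 4)]
    + [f"bio_{i}" for i in range(1, 8)]
    + ["depth", "pose"]
)

@dataclass
class Session:
    """One of the 95 quality-checked recording sessions (57 subjects)."""
    subject_id: int
    session_id: int
    root: Path

    def stream_path(self, modality: str, sequence: Sequence) -> Path:
        """Resolve a stream under an assumed <root>/subject/session layout."""
        if modality not in MODALITIES:
            raise ValueError(f"unknown modality: {modality!r}")
        return (self.root
                / f"subject_{self.subject_id:02d}"
                / f"session_{self.session_id:02d}"
                / sequence.value.lower()
                / modality)

if __name__ == "__main__":
    # Example: locate the first video stream of the Overload sequence
    # for subject 1, session 1 (paths are hypothetical).
    s = Session(subject_id=1, session_id=1, root=Path("/data/uulmMAC"))
    print(s.stream_path("video_1", Sequence.OVERLOAD))

A real loader would of course follow the dataset's own documentation and file naming once obtained from the authors.
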
format Online
Article
Text
id pubmed-7219061
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7219061 2020-05-22 The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction Hazer-Rau, Dilana Meudt, Sascha Daucher, Andreas Spohrs, Jennifer Hoffmann, Holger Schwenker, Friedhelm Traue, Harald C. Sensors (Basel) Article In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based, HCI-relevant emotional and cognitive load states. It consists of six experimental sequences, inducing Interest, Overload, Normal, Easy, Underload, and Frustration. Each sequence is followed by subjective feedback to validate the induction, a respiration baseline to allow the physiological reactions to level off, and a summary of results. Prior to the experiment, each subject completed three questionnaires on emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI), used to evaluate the stability of the induction paradigm. Based on this HCI scenario, the University of Ulm Multimodal Affective Corpus (uulmMAC), consisting of two homogeneous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities, including 4 × video, 3 × audio, and 7 × biophysiological streams, plus depth and pose streams. In addition, labels and annotations were collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final uulmMAC dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedback shows significant differences between the sequences, consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our uulmMAC database is a valuable contribution to the field of affective computing and multimodal data analysis: acquired in a mobile interactive scenario close to real HCI, it comprises a large number of subjects and allows transtemporal investigations. Validated via subjective feedback and checked for quality issues, it can be used for affective computing and machine learning applications. MDPI 2020-04-17 /pmc/articles/PMC7219061/ /pubmed/32316626 http://dx.doi.org/10.3390/s20082308 Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Hazer-Rau, Dilana
Meudt, Sascha
Daucher, Andreas
Spohrs, Jennifer
Hoffmann, Holger
Schwenker, Friedhelm
Traue, Harald C.
The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction
title The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction
title_full The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction
title_fullStr The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction
title_full_unstemmed The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction
title_short The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction
title_sort uulmmac database—a multimodal affective corpus for affective computing in human-computer interaction
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7219061/
https://www.ncbi.nlm.nih.gov/pubmed/32316626
http://dx.doi.org/10.3390/s20082308
work_keys_str_mv AT hazerraudilana theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT meudtsascha theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT daucherandreas theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT spohrsjennifer theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT hoffmannholger theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT schwenkerfriedhelm theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT traueharaldc theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT hazerraudilana uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT meudtsascha uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT daucherandreas uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT spohrsjennifer uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT hoffmannholger uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT schwenkerfriedhelm uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction
AT traueharaldc uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction