Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies
Others' movements inform us about their current activities as well as their intentions and emotions. Research on the distinct mechanisms underlying action recognition and emotion inferences has been limited by a lack of suitable comparative stimulus material. Problematic confounds can derive from low-level physical features (e.g., luminance) as well as from higher-level psychological features (e.g., stimulus difficulty). …
Main Authors: | Lammers, Sebastian; Bente, Gary; Tepest, Ralf; Jording, Mathis; Roth, Daniel; Vogeley, Kai |
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2019 |
Subjects: | Robotics and AI |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805965/ https://www.ncbi.nlm.nih.gov/pubmed/33501109 http://dx.doi.org/10.3389/frobt.2019.00094 |
_version_ | 1783636423622000640 |
author | Lammers, Sebastian Bente, Gary Tepest, Ralf Jording, Mathis Roth, Daniel Vogeley, Kai |
author_facet | Lammers, Sebastian Bente, Gary Tepest, Ralf Jording, Mathis Roth, Daniel Vogeley, Kai |
author_sort | Lammers, Sebastian |
collection | PubMed |
description | Others' movements inform us about their current activities as well as their intentions and emotions. Research on the distinct mechanisms underlying action recognition and emotion inferences has been limited by a lack of suitable comparative stimulus material. Problematic confounds can derive from low-level physical features (e.g., luminance) as well as from higher-level psychological features (e.g., stimulus difficulty). Here we present a standardized stimulus dataset that allows both action and emotion recognition to be addressed with identical stimuli. The stimulus set consists of 792 computer animations of a neutral avatar based on full-body motion capture protocols. Motion capture was performed on 22 human volunteers, who were instructed to perform six everyday activities (mopping, sweeping, painting with a roller, painting with a brush, wiping, sanding) in three different moods (angry, happy, sad). Five-second clips of each motion protocol were rendered into AVI files using two virtual camera perspectives per clip. In contrast to video stimuli, the computer animations made it possible to standardize the physical appearance of the avatar and to control lighting and coloring conditions, thus reducing stimulus variation to movement alone. To control for low-level optical features of the stimuli, we developed and applied a set of MATLAB routines that extract basic physical features of the stimuli, including the average background-foreground proportion and frame-by-frame pixel change dynamics. This information was used to identify outliers and to homogenize the stimuli across action and emotion categories. This led to a smaller stimulus subset (n = 83 animations within the 792-clip database) that contained only two actions (mopping, sweeping) and two moods (angry, happy). To further homogenize this subset with regard to psychological criteria, we conducted an online observer study (N = 112 participants) to assess recognition rates for actions and moods, which led to a final sub-selection of 32 clips (eight per category) within the database. The ACASS database and its subsets provide unique opportunities for research applications in social psychology, social neuroscience, and applied clinical studies on communication disorders. All 792 AVI files, the selected subsets, MATLAB code, annotations, and motion capture data (FBX files) are available online. |
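The low-level feature extraction described above (average background-foreground proportion and frame-by-frame pixel change dynamics) can be illustrated with a short script. The following is a minimal MATLAB sketch, not the published ACASS routines: the file name `acass_clip_example.avi` and the luminance threshold used to separate the avatar from the background are assumptions made purely for illustration; the authoritative implementation is the MATLAB code released with the dataset.

```matlab
% Minimal sketch (not the released ACASS routines): estimate two low-level
% features for one stimulus clip -- average foreground proportion and
% mean frame-by-frame pixel change. File name and threshold are assumptions.
v = VideoReader('acass_clip_example.avi');   % hypothetical clip name

fgThreshold  = 0.10;   % assumed luminance cutoff separating avatar from dark background
fgProportion = [];     % fraction of foreground pixels per frame
pixelChange  = [];     % mean absolute luminance change between consecutive frames
prevFrame    = [];

while hasFrame(v)
    frame = im2double(rgb2gray(readFrame(v)));            % grayscale frame, values in [0,1]
    fgMask = frame > fgThreshold;                          % crude foreground segmentation
    fgProportion(end+1) = nnz(fgMask) / numel(fgMask);     %#ok<AGROW>
    if ~isempty(prevFrame)
        pixelChange(end+1) = mean(abs(frame(:) - prevFrame(:))); %#ok<AGROW>
    end
    prevFrame = frame;
end

fprintf('Average foreground proportion: %.3f\n', mean(fgProportion));
fprintf('Mean frame-by-frame pixel change: %.4f\n', mean(pixelChange));
```

In the workflow the abstract describes, per-clip values like these would be computed for all 792 clips and then used to flag outliers and to match the action and emotion categories on these low-level features.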
format | Online Article Text |
id | pubmed-7805965 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-7805965 2021-01-25 Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies Lammers, Sebastian Bente, Gary Tepest, Ralf Jording, Mathis Roth, Daniel Vogeley, Kai Front Robot AI Robotics and AI Others' movements inform us about their current activities as well as their intentions and emotions. Research on the distinct mechanisms underlying action recognition and emotion inferences has been limited by a lack of suitable comparative stimulus material. Problematic confounds can derive from low-level physical features (e.g., luminance) as well as from higher-level psychological features (e.g., stimulus difficulty). Here we present a standardized stimulus dataset that allows both action and emotion recognition to be addressed with identical stimuli. The stimulus set consists of 792 computer animations of a neutral avatar based on full-body motion capture protocols. Motion capture was performed on 22 human volunteers, who were instructed to perform six everyday activities (mopping, sweeping, painting with a roller, painting with a brush, wiping, sanding) in three different moods (angry, happy, sad). Five-second clips of each motion protocol were rendered into AVI files using two virtual camera perspectives per clip. In contrast to video stimuli, the computer animations made it possible to standardize the physical appearance of the avatar and to control lighting and coloring conditions, thus reducing stimulus variation to movement alone. To control for low-level optical features of the stimuli, we developed and applied a set of MATLAB routines that extract basic physical features of the stimuli, including the average background-foreground proportion and frame-by-frame pixel change dynamics. This information was used to identify outliers and to homogenize the stimuli across action and emotion categories. This led to a smaller stimulus subset (n = 83 animations within the 792-clip database) that contained only two actions (mopping, sweeping) and two moods (angry, happy). To further homogenize this subset with regard to psychological criteria, we conducted an online observer study (N = 112 participants) to assess recognition rates for actions and moods, which led to a final sub-selection of 32 clips (eight per category) within the database. The ACASS database and its subsets provide unique opportunities for research applications in social psychology, social neuroscience, and applied clinical studies on communication disorders. All 792 AVI files, the selected subsets, MATLAB code, annotations, and motion capture data (FBX files) are available online. Frontiers Media S.A. 2019-09-27 /pmc/articles/PMC7805965/ /pubmed/33501109 http://dx.doi.org/10.3389/frobt.2019.00094 Text en Copyright © 2019 Lammers, Bente, Tepest, Jording, Roth and Vogeley. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Robotics and AI Lammers, Sebastian Bente, Gary Tepest, Ralf Jording, Mathis Roth, Daniel Vogeley, Kai Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies |
title | Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies |
title_full | Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies |
title_fullStr | Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies |
title_full_unstemmed | Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies |
title_short | Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies |
title_sort | introducing acass: an annotated character animation stimulus set for controlled (e)motion perception studies |
topic | Robotics and AI |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805965/ https://www.ncbi.nlm.nih.gov/pubmed/33501109 http://dx.doi.org/10.3389/frobt.2019.00094 |
work_keys_str_mv | AT lammerssebastian introducingacassanannotatedcharacteranimationstimulussetforcontrolledemotionperceptionstudies AT bentegary introducingacassanannotatedcharacteranimationstimulussetforcontrolledemotionperceptionstudies AT tepestralf introducingacassanannotatedcharacteranimationstimulussetforcontrolledemotionperceptionstudies AT jordingmathis introducingacassanannotatedcharacteranimationstimulussetforcontrolledemotionperceptionstudies AT rothdaniel introducingacassanannotatedcharacteranimationstimulussetforcontrolledemotionperceptionstudies AT vogeleykai introducingacassanannotatedcharacteranimationstimulussetforcontrolledemotionperceptionstudies |