Can an android’s posture and movement discriminate against the ambiguous emotion perceived from its facial expressions?
Expressing emotions through various modalities is a crucial function not only for humans but also for robots. The method of mapping facial expressions to basic emotions is widely used in research on robot emotional expression. This method claims that there are specific facial muscle activatio...
Main authors: Yagi, Satoshi; Nakata, Yoshihiro; Nakamura, Yutaka; Ishiguro, Hiroshi
Format: Online Article Text
Language: English
Published: Public Library of Science, 2021
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8354482/
https://www.ncbi.nlm.nih.gov/pubmed/34375327
http://dx.doi.org/10.1371/journal.pone.0254905
Similar items
- Spontaneous gait phase synchronization of human to a wheeled mobile robot with replicating gait-induced upper body oscillating motion
  by: Yagi, Satoshi, et al.
  Published: (2022)
- Emotion category-modulated interpretation bias in perceiving ambiguous facial expressions
  by: Todd, Emily, et al.
  Published: (2023)
- Subthalamic nucleus detects unnatural android movement
  by: Ikeda, Takashi, et al.
  Published: (2017)
- An Android for Emotional Interaction: Spatiotemporal Validation of Its Facial Expressions
  by: Sato, Wataru, et al.
  Published: (2022)
- Implementation and Evaluation of a Grip Behavior Model to Express Emotions for an Android Robot
  by: Shiomi, Masahiro, et al.
  Published: (2021)