Facing the FACS—Using AI to Evaluate and Control Facial Action Units in Humanoid Robot Face Development

This paper presents a new approach for evaluating and controlling expressive humanoid robotic faces using open-source computer vision and machine learning methods. Existing research in Human-Robot Interaction lacks flexible and simple tools that are scalable for evaluating and controlling various robotic faces; thus, our goal is to demonstrate the use of readily available AI-based solutions to support the process. We use a newly developed humanoid robot prototype intended for medical training applications as a case example. The approach automatically captures the robot’s facial action units through a webcam during random motion, which are components traditionally used to describe facial muscle movements in humans. Instead of manipulating the actuators individually or training the robot to express specific emotions, we propose using action units as a means for controlling the robotic face, which enables a multitude of ways to generate dynamic motion, expressions, and behavior. The range of action units achieved by the robot is thus analyzed to discover its expressive capabilities and limitations and to develop a control model by correlating action units to actuation parameters. Because the approach is not dependent on specific facial attributes or actuation capabilities, it can be used for different designs and continuously inform the development process. In healthcare training applications, our goal is to establish a prerequisite of expressive capabilities of humanoid robots bounded by industrial and medical design constraints. Furthermore, to mediate human interpretation and thus enable decision-making based on observed cognitive, emotional, and expressive cues, our approach aims to find the minimum viable expressive capabilities of the robot without having to optimize for realism. The results from our case example demonstrate the flexibility and efficiency of the presented AI-based solutions to support the development of humanoid facial robots.

Bibliographic Details
Main Authors: Auflem, Marius; Kohtala, Sampsa; Jung, Malte; Steinert, Martin
Format: Online Article (Text)
Language: English
Published: Frontiers Media S.A., 2022-06-14 (Front Robot AI, section Robotics and AI)
Subjects: Robotics and AI
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9237251/
https://www.ncbi.nlm.nih.gov/pubmed/35774595
http://dx.doi.org/10.3389/frobt.2022.887645
License: Copyright © 2022 Auflem, Kohtala, Jung and Steinert. Open-access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0); use, distribution, or reproduction is permitted provided the original authors and publication are credited.
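The abstract describes a two-step pipeline: drive the robot face through random actuator poses while a webcam-facing AU detector records the resulting facial action unit intensities, then correlate AUs with actuation parameters to obtain a control model. The following is a minimal sketch of that idea, not the authors' implementation: `estimate_action_units` and `set_actuators` are hypothetical stand-ins for an open-source AU detector and the robot's actuator interface (neither API is specified in the record), the actuator count and sample budget are illustrative, and a plain linear least-squares fit is assumed as the simplest possible correlation model.

```python
# Sketch of the random-motion AU calibration loop, under the assumptions above.
import time

import cv2
import numpy as np

N_ACTUATORS = 10  # assumed actuator count, for illustration only
N_SAMPLES = 500   # number of random poses to sample


def estimate_action_units(frame: np.ndarray) -> np.ndarray:
    """Hypothetical wrapper around an open-source AU detector.
    Returns a vector of AU intensities for the face seen in `frame`."""
    raise NotImplementedError


def set_actuators(u: np.ndarray) -> None:
    """Hypothetical robot interface: drive actuators to positions u in [0, 1]."""
    raise NotImplementedError


cap = cv2.VideoCapture(0)  # webcam observing the robot face

# Step 1: random-motion data collection. Command random actuator poses and
# record the AU intensities the webcam sees for each pose.
U, A = [], []
for _ in range(N_SAMPLES):
    u = np.random.uniform(0.0, 1.0, N_ACTUATORS)
    set_actuators(u)
    time.sleep(0.2)  # let the face settle before capturing
    ok, frame = cap.read()
    if not ok:
        continue
    U.append(u)
    A.append(estimate_action_units(frame))
U, A = np.asarray(U), np.asarray(A)

# Step 2: control model. Correlate AUs with actuation parameters via a
# least-squares fit of actuator commands onto observed AUs (plus a bias term).
A1 = np.hstack([A, np.ones((len(A), 1))])
W, *_ = np.linalg.lstsq(A1, U, rcond=None)  # maps target AUs -> actuator commands


def actuators_for(target_aus: np.ndarray) -> np.ndarray:
    """Estimate actuator commands for a desired AU vector, clipped to range."""
    return np.clip(np.append(target_aus, 1.0) @ W, 0.0, 1.0)
```

A linear map is only the simplest model consistent with the abstract's "correlating action units to actuation parameters"; the same collected data also bounds the robot's reachable AU range, which is how the paper characterizes its expressive capabilities and limitations.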