EmoTour: Estimating Emotion and Satisfaction of Users Based on Behavioral Cues and Audiovisual Data


Bibliographic Details
Main Authors: Matsuda, Yuki, Fedotov, Dmitrii, Takahashi, Yuta, Arakawa, Yutaka, Yasumoto, Keiichi, Minker, Wolfgang
Format: Online Article Text
Language: English
Published: MDPI 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6263657/
https://www.ncbi.nlm.nih.gov/pubmed/30445798
http://dx.doi.org/10.3390/s18113978
author Matsuda, Yuki
Fedotov, Dmitrii
Takahashi, Yuta
Arakawa, Yutaka
Yasumoto, Keiichi
Minker, Wolfgang
collection PubMed
description With the spread of smart devices, people can obtain a variety of information about their surrounding environment thanks to sensing technologies. To design more context-aware systems, psychological user context (e.g., emotional status) is a substantial factor for providing useful information at the appropriate time. As a typical use case that has a high demand for context awareness but has not yet been widely tackled, we focus on the tourism domain. In this study, we aim to estimate the emotional status and satisfaction level of tourists during sightseeing from their unconscious and natural actions. As tourist actions, behavioral cues (eye and head/body movement) and audiovisual data (facial/vocal expressions) were collected during sightseeing using an eye-gaze tracker, physical-activity sensors, and a smartphone. We then derived high-level features, e.g., head tilt and footsteps, from the behavioral cues. We also used existing databases of emotionally rich interactions to train emotion-recognition models and applied them in a cross-corpus fashion to generate emotional-state predictions for the audiovisual data. Finally, the features from the several modalities were fused to estimate the emotion of tourists during sightseeing. To evaluate our system, we conducted experiments with 22 tourists in two touristic areas located in Germany and Japan. As a result, we confirmed the feasibility of estimating both the emotional status and satisfaction level of tourists. In addition, we found that the features effective for emotion and satisfaction estimation differ among tourists with different cultural backgrounds.
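The abstract above describes a multimodal pipeline: high-level behavioral features are combined with emotion predictions from audiovisual models, and the fused features are used to estimate emotion and satisfaction. As a rough illustration of that fusion step only, here is a minimal Python sketch; the feature dimensions, label scheme, and random-forest classifier are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's code) of feature-level fusion:
# behavioral features and audiovisual emotion-model outputs are concatenated
# per sightseeing segment, then a classifier predicts a satisfaction label.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_segments = 200  # hypothetical number of sightseeing segments

# Behavioral cues -> high-level features (e.g., head tilt, footstep counts).
behavioral = rng.normal(size=(n_segments, 8))

# Audiovisual data -> predictions from cross-corpus emotion models
# (e.g., arousal/valence scores from facial and vocal models).
audiovisual = rng.normal(size=(n_segments, 4))

# Feature-level (early) fusion: concatenate the modalities per segment.
fused = np.hstack([behavioral, audiovisual])

# Hypothetical satisfaction labels (0 = low, 1 = neutral, 2 = high).
labels = rng.integers(0, 3, size=n_segments)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, fused, labels, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With real features in place of the random placeholders, per-modality importances from such a classifier would also let one compare which features matter for different tourist groups, as the abstract's cultural-background finding suggests.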
format Online Article Text
id pubmed-6263657
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-6263657 2018-12-12 EmoTour: Estimating Emotion and Satisfaction of Users Based on Behavioral Cues and Audiovisual Data. Matsuda, Yuki; Fedotov, Dmitrii; Takahashi, Yuta; Arakawa, Yutaka; Yasumoto, Keiichi; Minker, Wolfgang. Sensors (Basel), Article. MDPI, published online 2018-11-15. /pmc/articles/PMC6263657/ /pubmed/30445798 http://dx.doi.org/10.3390/s18113978 Text en © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
title EmoTour: Estimating Emotion and Satisfaction of Users Based on Behavioral Cues and Audiovisual Data
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6263657/
https://www.ncbi.nlm.nih.gov/pubmed/30445798
http://dx.doi.org/10.3390/s18113978