
Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study

Bibliographic Details
Main Authors: Quiroz, Juan Carlos, Geangu, Elena, Yong, Min Hooi
Format: Online Article Text
Language: English
Published: JMIR Publications 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6105867/
https://www.ncbi.nlm.nih.gov/pubmed/30089610
http://dx.doi.org/10.2196/10153
_version_ 1783349698434695168
author Quiroz, Juan Carlos
Geangu, Elena
Yong, Min Hooi
author_facet Quiroz, Juan Carlos
Geangu, Elena
Yong, Min Hooi
author_sort Quiroz, Juan Carlos
collection PubMed
description BACKGROUND: Research in psychology has shown that the way a person walks reflects that person’s current mood (or emotional state). Recent studies have used mobile phones to detect emotional states from movement data. OBJECTIVE: The objective of our study was to investigate the use of movement sensor data from a smart watch to infer an individual’s emotional state. We present our findings of a user study with 50 participants. METHODS: The experimental design is a mixed-design study: within-subjects (emotions: happy, sad, and neutral) and between-subjects (stimulus type: audiovisual “movie clips” and audio “music clips”). Each participant experienced both emotions in a single stimulus type. All participants walked 250 m while wearing a smart watch on one wrist and a heart rate monitor strap on the chest. They also answered a short questionnaire (20 items; Positive Affect and Negative Affect Schedule, PANAS) before and after experiencing each emotion. The data obtained from the heart rate monitor served as supplementary information. We performed time series analysis on data from the smart watch and a t test on questionnaire items to measure the change in emotional state. Heart rate data were analyzed using one-way analysis of variance. We extracted features from the time series using sliding windows and used these features to train and validate classifiers that determined an individual’s emotion. RESULTS: Overall, 50 young adults participated in our study; of them, 49 were included for the affective PANAS questionnaire and 44 for the feature extraction and building of personal models. Participants reported feeling less negative affect after watching sad videos or after listening to sad music, P<.006.
For the task of emotion recognition using classifiers, our results showed that personal models outperformed personal baselines and achieved median accuracies higher than 78% for all conditions of the study design for binary classification of happiness versus sadness. CONCLUSIONS: Our findings show that we were able to detect changes in emotional state, as well as in behavioral responses, with data obtained from the smart watch. Together with the high accuracies achieved across all users for classification of happy versus sad emotional states, this is further evidence for the hypothesis that movement sensor data can be used for emotion recognition.
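The pipeline the abstract describes — sliding-window feature extraction from a walking-movement time series, then a per-participant ("personal") model classifying happy versus sad — can be sketched roughly as follows. The window length, the five statistical features, the synthetic traces, and the nearest-centroid classifier are all illustrative assumptions; the abstract states only that features were extracted with sliding windows and used to train and validate classifiers per individual.

```python
import numpy as np

def sliding_window_features(signal, window_size=128, step=64):
    """Extract mean, std, min, max, and RMS from overlapping windows.

    Window length, overlap, and feature set are assumptions for this
    sketch, not values reported in the paper.
    """
    feats = []
    for start in range(0, len(signal) - window_size + 1, step):
        w = signal[start:start + window_size]
        feats.append([w.mean(), w.std(), w.min(), w.max(),
                      np.sqrt(np.mean(w ** 2))])
    return np.array(feats)

rng = np.random.default_rng(0)
# Synthetic wrist-acceleration traces standing in for one participant's
# two labeled walks (the real data came from the smart watch sensors).
happy = np.sin(np.linspace(0, 60, 2000)) + 0.3 * rng.standard_normal(2000)
sad = 0.6 * np.sin(np.linspace(0, 45, 2000)) + 0.3 * rng.standard_normal(2000)

Xh = sliding_window_features(happy)   # 30 windows x 5 features
Xs = sliding_window_features(sad)

# "Personal model": fit on the first half of this one participant's
# windows, evaluate on the held-out second half.
train_h, test_h = Xh[:15], Xh[15:]
train_s, test_s = Xs[:15], Xs[15:]
ch, cs = train_h.mean(axis=0), train_s.mean(axis=0)  # class centroids

def predict(x):
    # Nearest-centroid rule: 1 = happy, 0 = sad
    return int(np.linalg.norm(x - ch) < np.linalg.norm(x - cs))

preds = [predict(x) for x in test_h] + [predict(x) for x in test_s]
truth = [1] * len(test_h) + [0] * len(test_s)
acc = np.mean([p == t for p, t in zip(preds, truth)])
print(f"holdout accuracy: {acc:.2f}")
```

A personal baseline for comparison (which the abstract says the personal models outperformed) would be, for example, always predicting the participant's majority class on the held-out windows.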
format Online
Article
Text
id pubmed-6105867
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher JMIR Publications
record_format MEDLINE/PubMed
spelling pubmed-6105867 2018-08-30 Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study Quiroz, Juan Carlos; Geangu, Elena; Yong, Min Hooi. JMIR Ment Health, Original Paper. JMIR Publications 2018-08-08 /pmc/articles/PMC6105867/ /pubmed/30089610 http://dx.doi.org/10.2196/10153 Text en ©Juan Carlos Quiroz, Elena Geangu, Min Hooi Yong. Originally published in JMIR Mental Health (http://mental.jmir.org), 08.08.2018. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on http://mental.jmir.org/, as well as this copyright and license information must be included.
spellingShingle Original Paper
Quiroz, Juan Carlos
Geangu, Elena
Yong, Min Hooi
Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study
title Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study
title_full Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study
title_fullStr Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study
title_full_unstemmed Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study
title_short Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study
title_sort emotion recognition using smart watch sensor data: mixed-design study
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6105867/
https://www.ncbi.nlm.nih.gov/pubmed/30089610
http://dx.doi.org/10.2196/10153
work_keys_str_mv AT quirozjuancarlos emotionrecognitionusingsmartwatchsensordatamixeddesignstudy
AT geanguelena emotionrecognitionusingsmartwatchsensordatamixeddesignstudy
AT yongminhooi emotionrecognitionusingsmartwatchsensordatamixeddesignstudy