
Emotion recognition using Kinect motion capture data of human gaits

Automatic emotion recognition is of great value in many applications; however, to fully realize the application value of emotion recognition, more portable, non-intrusive, and inexpensive technologies need to be developed. Human gaits can reflect the walker’s emotional state and could be an information source for emotion recognition. …


Bibliographic Details
Main Authors: Li, Shun, Cui, Liqing, Zhu, Changye, Li, Baobin, Zhao, Nan, Zhu, Tingshao
Format: Online Article Text
Language: English
Published: PeerJ Inc. 2016
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5028730/
https://www.ncbi.nlm.nih.gov/pubmed/27672492
http://dx.doi.org/10.7717/peerj.2364
_version_ 1782454385434951680
author Li, Shun
Cui, Liqing
Zhu, Changye
Li, Baobin
Zhao, Nan
Zhu, Tingshao
author_facet Li, Shun
Cui, Liqing
Zhu, Changye
Li, Baobin
Zhao, Nan
Zhu, Tingshao
author_sort Li, Shun
collection PubMed
description Automatic emotion recognition is of great value in many applications; however, to fully realize the application value of emotion recognition, more portable, non-intrusive, and inexpensive technologies need to be developed. Human gaits can reflect the walker’s emotional state and could be an information source for emotion recognition. This paper proposed a novel method to recognize emotional state through human gaits by using Microsoft Kinect, a low-cost, portable, camera-based sensor. Fifty-nine participants’ gaits under a neutral state, induced anger, and induced happiness were recorded by two Kinect cameras, and the original data were processed through joint selection, coordinate-system transformation, sliding-window Gaussian filtering, differential operation, and data segmentation. Features of gait patterns were extracted from the 3-dimensional coordinates of 14 main body joints by Fourier transformation and Principal Component Analysis (PCA). The classifiers NaiveBayes, RandomForests, LibSVM, and SMO (Sequential Minimal Optimization) were trained and evaluated, and the accuracies of recognizing anger and happiness from the neutral state reached 80.5% and 75.4%, respectively. Although the results of distinguishing the angry and happy states were not ideal in the current study, they showed the feasibility of automatically recognizing emotional states from gaits, with characteristics that meet the application requirements.
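
The description above amounts to a concrete processing pipeline: smooth and difference the 3-D trajectories of the 14 joints, take Fourier magnitudes as features, reduce them with PCA, and train the classifiers. Below is a minimal sketch of that flow in Python (NumPy/SciPy/scikit-learn). The filter width, number of retained frequency bins, PCA dimensionality, and the synthetic input data are illustrative assumptions rather than values from the paper; joint selection and coordinate-system transformation are assumed already done; and GaussianNB, RandomForestClassifier, and a linear SVC stand in for the NaiveBayes, RandomForests, and LibSVM/SMO classifiers named in the abstract.

    # A minimal sketch of the pipeline, under the assumptions stated above.
    import numpy as np
    from scipy.ndimage import gaussian_filter1d
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.svm import SVC

    def extract_features(joints, sigma=2.0, n_freqs=16):
        # joints: (n_frames, 14, 3) array of x/y/z positions of 14 body joints,
        # already selected and transformed into a walker-centred coordinate system.
        n_frames, n_joints, _ = joints.shape
        series = joints.reshape(n_frames, n_joints * 3)
        # Gaussian smoothing stands in for the sliding-window Gauss filtering step.
        smoothed = gaussian_filter1d(series, sigma=sigma, axis=0)
        # Differential operation: frame-to-frame change of every coordinate.
        velocity = np.diff(smoothed, axis=0)
        # Fourier transformation: keep low-frequency magnitudes as raw features.
        spectrum = np.abs(np.fft.rfft(velocity, axis=0))[:n_freqs]
        return spectrum.flatten()

    # Hypothetical stand-in data: 59 participants x 2 emotional states.
    rng = np.random.default_rng(0)
    X = np.array([extract_features(rng.normal(size=(120, 14, 3)))
                  for _ in range(118)])
    y = np.array([0, 1] * 59)  # 0 = neutral, 1 = induced anger

    # PCA reduces the Fourier features before classification.
    X_reduced = PCA(n_components=20).fit_transform(X)

    for name, clf in [("NaiveBayes", GaussianNB()),
                      ("RandomForest", RandomForestClassifier(random_state=0)),
                      ("SVM (LibSVM/SMO stand-in)", SVC(kernel="linear"))]:
        scores = cross_val_score(clf, X_reduced, y, cv=5)
        print(f"{name}: mean CV accuracy {scores.mean():.3f}")

On real recordings the cross-validated accuracies would correspond to the 80.5% (anger vs. neutral) and 75.4% (happiness vs. neutral) figures reported above; on the random stand-in data here the scores should hover around chance, which serves as a sanity check of the harness rather than a result.
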
format Online
Article
Text
id pubmed-5028730
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher PeerJ Inc.
record_format MEDLINE/PubMed
spelling pubmed-5028730 2016-09-26 Emotion recognition using Kinect motion capture data of human gaits Li, Shun Cui, Liqing Zhu, Changye Li, Baobin Zhao, Nan Zhu, Tingshao PeerJ Kinesiology Automatic emotion recognition is of great value in many applications; however, to fully realize the application value of emotion recognition, more portable, non-intrusive, and inexpensive technologies need to be developed. Human gaits can reflect the walker’s emotional state and could be an information source for emotion recognition. This paper proposed a novel method to recognize emotional state through human gaits by using Microsoft Kinect, a low-cost, portable, camera-based sensor. Fifty-nine participants’ gaits under a neutral state, induced anger, and induced happiness were recorded by two Kinect cameras, and the original data were processed through joint selection, coordinate-system transformation, sliding-window Gaussian filtering, differential operation, and data segmentation. Features of gait patterns were extracted from the 3-dimensional coordinates of 14 main body joints by Fourier transformation and Principal Component Analysis (PCA). The classifiers NaiveBayes, RandomForests, LibSVM, and SMO (Sequential Minimal Optimization) were trained and evaluated, and the accuracies of recognizing anger and happiness from the neutral state reached 80.5% and 75.4%, respectively. Although the results of distinguishing the angry and happy states were not ideal in the current study, they showed the feasibility of automatically recognizing emotional states from gaits, with characteristics that meet the application requirements. PeerJ Inc. 2016-09-15 /pmc/articles/PMC5028730/ /pubmed/27672492 http://dx.doi.org/10.7717/peerj.2364 Text en ©2016 Li et al. http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ) and either DOI or URL of the article must be cited.
spellingShingle Kinesiology
Li, Shun
Cui, Liqing
Zhu, Changye
Li, Baobin
Zhao, Nan
Zhu, Tingshao
Emotion recognition using Kinect motion capture data of human gaits
title Emotion recognition using Kinect motion capture data of human gaits
title_full Emotion recognition using Kinect motion capture data of human gaits
title_fullStr Emotion recognition using Kinect motion capture data of human gaits
title_full_unstemmed Emotion recognition using Kinect motion capture data of human gaits
title_short Emotion recognition using Kinect motion capture data of human gaits
title_sort emotion recognition using kinect motion capture data of human gaits
topic Kinesiology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5028730/
https://www.ncbi.nlm.nih.gov/pubmed/27672492
http://dx.doi.org/10.7717/peerj.2364
work_keys_str_mv AT lishun emotionrecognitionusingkinectmotioncapturedataofhumangaits
AT cuiliqing emotionrecognitionusingkinectmotioncapturedataofhumangaits
AT zhuchangye emotionrecognitionusingkinectmotioncapturedataofhumangaits
AT libaobin emotionrecognitionusingkinectmotioncapturedataofhumangaits
AT zhaonan emotionrecognitionusingkinectmotioncapturedataofhumangaits
AT zhutingshao emotionrecognitionusingkinectmotioncapturedataofhumangaits