
Mental states and personality based on real-time physical activity and facial expression recognition


Bibliographic Details
Main Authors: Huang, Yating, Zhai, Dengyue, Song, Jingze, Rao, Xuanheng, Sun, Xiao, Tang, Jin
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects: Psychiatry
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9868243/
https://www.ncbi.nlm.nih.gov/pubmed/36699483
http://dx.doi.org/10.3389/fpsyt.2022.1019043
_version_ 1784876488805318656
author Huang, Yating
Zhai, Dengyue
Song, Jingze
Rao, Xuanheng
Sun, Xiao
Tang, Jin
author_facet Huang, Yating
Zhai, Dengyue
Song, Jingze
Rao, Xuanheng
Sun, Xiao
Tang, Jin
author_sort Huang, Yating
collection PubMed
description INTRODUCTION: To explore a quick and non-invasive way to measure individual psychological states, this study developed interview-based scales and collected multi-modal information from 172 participants. METHODS: We developed the Interview Psychological Symptom Inventory (IPSI), which eventually retained 53 items with nine main factors. All of them performed well in terms of reliability and validity. We used optimized convolutional neural networks and original detection algorithms for the recognition of individual facial expressions and physical activity, based on Russell's circumplex model and the five factor model. RESULTS: We found a significant correlation between the developed scale and the participants' scores on each factor of the Symptom Checklist-90 (SCL-90) and the Big Five Inventory (BFI-2) [r = (−0.257, 0.632), p < 0.01]. Among the multi-modal data, the arousal of facial expressions was significantly correlated with the interval of validity (p < 0.01), valence was significantly correlated with the IPSI and SCL-90, and physical activity was significantly correlated with gender, age, and factors of the scales. DISCUSSION: Our research demonstrates that mental health can be monitored and assessed remotely by collecting and analyzing multimodal data from individuals captured by digital tools.
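
As a reading aid for this record, the sketch below illustrates one plausible way to place per-frame facial-expression probabilities onto Russell's valence/arousal circumplex and then compute the kind of factor-level Pearson correlation the abstract reports (r ranging from −0.257 to 0.632, p < 0.01). It is not the authors' pipeline: the circumplex anchor coordinates, the simulated participant data, and the use of scipy.stats.pearsonr are illustrative assumptions only.

    # Minimal sketch (not the authors' code): probability-weighted valence/arousal
    # under Russell's circumplex model, correlated with a questionnaire factor score.
    # All coordinates, arrays, and scores are hypothetical placeholders, not study data.
    import numpy as np
    from scipy.stats import pearsonr

    # Hypothetical (valence, arousal) anchors for discrete expressions on the circumplex.
    ANCHORS = np.array([
        [ 0.8,  0.5],   # happy
        [-0.7, -0.4],   # sad
        [-0.6,  0.7],   # angry
        [ 0.0,  0.0],   # neutral
    ])

    rng = np.random.default_rng(0)
    n_participants, n_frames = 172, 300   # 172 participants, as reported in the abstract

    # Hypothetical per-frame class probabilities from a facial-expression CNN.
    probs = rng.dirichlet(np.ones(len(ANCHORS)), size=(n_participants, n_frames))

    # Probability-weighted valence/arousal per frame, averaged per participant.
    valence_arousal = probs @ ANCHORS            # shape: (participants, frames, 2)
    mean_valence = valence_arousal[..., 0].mean(axis=1)

    # Hypothetical questionnaire factor scores (e.g., one IPSI or SCL-90 factor).
    factor_scores = 0.5 * mean_valence + rng.normal(size=n_participants)

    r, p = pearsonr(mean_valence, factor_scores)
    print(f"r = {r:.3f}, p = {p:.4g}")   # abstract reports r from -0.257 to 0.632, p < 0.01

Probability-weighted averaging over anchor coordinates is just one common way to obtain continuous valence/arousal from a discrete-emotion classifier; the article itself does not specify this step.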
format Online
Article
Text
id pubmed-9868243
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9868243 2023-01-24 Mental states and personality based on real-time physical activity and facial expression recognition Huang, Yating Zhai, Dengyue Song, Jingze Rao, Xuanheng Sun, Xiao Tang, Jin Front Psychiatry Psychiatry INTRODUCTION: To explore a quick and non-invasive way to measure individual psychological states, this study developed interview-based scales and collected multi-modal information from 172 participants. METHODS: We developed the Interview Psychological Symptom Inventory (IPSI), which eventually retained 53 items with nine main factors. All of them performed well in terms of reliability and validity. We used optimized convolutional neural networks and original detection algorithms for the recognition of individual facial expressions and physical activity, based on Russell's circumplex model and the five factor model. RESULTS: We found a significant correlation between the developed scale and the participants' scores on each factor of the Symptom Checklist-90 (SCL-90) and the Big Five Inventory (BFI-2) [r = (−0.257, 0.632), p < 0.01]. Among the multi-modal data, the arousal of facial expressions was significantly correlated with the interval of validity (p < 0.01), valence was significantly correlated with the IPSI and SCL-90, and physical activity was significantly correlated with gender, age, and factors of the scales. DISCUSSION: Our research demonstrates that mental health can be monitored and assessed remotely by collecting and analyzing multimodal data from individuals captured by digital tools. Frontiers Media S.A. 2023-01-09 /pmc/articles/PMC9868243/ /pubmed/36699483 http://dx.doi.org/10.3389/fpsyt.2022.1019043 Text en Copyright © 2023 Huang, Zhai, Song, Rao, Sun and Tang. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Psychiatry
Huang, Yating
Zhai, Dengyue
Song, Jingze
Rao, Xuanheng
Sun, Xiao
Tang, Jin
Mental states and personality based on real-time physical activity and facial expression recognition
title Mental states and personality based on real-time physical activity and facial expression recognition
title_full Mental states and personality based on real-time physical activity and facial expression recognition
title_fullStr Mental states and personality based on real-time physical activity and facial expression recognition
title_full_unstemmed Mental states and personality based on real-time physical activity and facial expression recognition
title_short Mental states and personality based on real-time physical activity and facial expression recognition
title_sort mental states and personality based on real-time physical activity and facial expression recognition
topic Psychiatry
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9868243/
https://www.ncbi.nlm.nih.gov/pubmed/36699483
http://dx.doi.org/10.3389/fpsyt.2022.1019043
work_keys_str_mv AT huangyating mentalstatesandpersonalitybasedonrealtimephysicalactivityandfacialexpressionrecognition
AT zhaidengyue mentalstatesandpersonalitybasedonrealtimephysicalactivityandfacialexpressionrecognition
AT songjingze mentalstatesandpersonalitybasedonrealtimephysicalactivityandfacialexpressionrecognition
AT raoxuanheng mentalstatesandpersonalitybasedonrealtimephysicalactivityandfacialexpressionrecognition
AT sunxiao mentalstatesandpersonalitybasedonrealtimephysicalactivityandfacialexpressionrecognition
AT tangjin mentalstatesandpersonalitybasedonrealtimephysicalactivityandfacialexpressionrecognition