Prediction of Continuous Emotional Measures through Physiological and Visual Data †

The affective state of a person can be measured using arousal and valence values. In this article, we contribute to the prediction of arousal and valence values from various data sources. Our goal is to later use such predictive models to adaptively adjust virtual reality (VR) environments and help facilitate cognitive remediation exercises for users with mental health disorders, such as schizophrenia, while avoiding discouragement. Building on our previous work on physiological, electrodermal activity (EDA) and electrocardiogram (ECG) recordings, we propose improving preprocessing and adding novel feature selection and decision fusion processes. We use video recordings as an additional data source for predicting affective states. We implement an innovative solution based on a combination of machine learning models alongside a series of preprocessing steps. We test our approach on RECOLA, a publicly available dataset. The best results are obtained with a concordance correlation coefficient (CCC) of 0.996 for arousal and 0.998 for valence using physiological data. Related work in the literature reported lower CCCs on the same data modality; thus, our approach outperforms the state-of-the-art approaches for RECOLA. Our study underscores the potential of using advanced machine learning techniques with diverse data sources to enhance the personalization of VR environments.
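As a point of reference for the results quoted above, the evaluation metric named in the abstract, the concordance correlation coefficient (CCC), can be computed as in the short Python sketch below. This is the standard (Lin's) CCC; the function name and the toy arousal values are illustrative assumptions, not code or data from the article or the RECOLA dataset.

import numpy as np

def concordance_correlation_coefficient(y_true, y_pred):
    # Lin's CCC: penalizes both low correlation and systematic offsets in mean or scale.
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()  # population variances (ddof=0)
    cov = np.mean((y_true - mean_t) * (y_pred - mean_p))
    return 2.0 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)

# Toy example with made-up arousal annotations (not data from the paper):
gold = [0.10, 0.25, 0.40, 0.35, 0.20]
pred = [0.12, 0.22, 0.38, 0.30, 0.25]
print(round(concordance_correlation_coefficient(gold, pred), 3))

A CCC of 1 indicates perfect agreement between predicted and annotated traces, so the 0.996 (arousal) and 0.998 (valence) reported in the abstract correspond to near-perfect agreement on the physiological modality.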

Bibliographic Details
Main Authors: Joudeh, Itaf Omar, Cretu, Ana-Maria, Bouchard, Stéphane, Guimond, Synthia
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10303095/
https://www.ncbi.nlm.nih.gov/pubmed/37420778
http://dx.doi.org/10.3390/s23125613
_version_ 1785065197417791488
author Joudeh, Itaf Omar
Cretu, Ana-Maria
Bouchard, Stéphane
Guimond, Synthia
author_facet Joudeh, Itaf Omar
Cretu, Ana-Maria
Bouchard, Stéphane
Guimond, Synthia
author_sort Joudeh, Itaf Omar
collection PubMed
description The affective state of a person can be measured using arousal and valence values. In this article, we contribute to the prediction of arousal and valence values from various data sources. Our goal is to later use such predictive models to adaptively adjust virtual reality (VR) environments and help facilitate cognitive remediation exercises for users with mental health disorders, such as schizophrenia, while avoiding discouragement. Building on our previous work on physiological, electrodermal activity (EDA) and electrocardiogram (ECG) recordings, we propose improving preprocessing and adding novel feature selection and decision fusion processes. We use video recordings as an additional data source for predicting affective states. We implement an innovative solution based on a combination of machine learning models alongside a series of preprocessing steps. We test our approach on RECOLA, a publicly available dataset. The best results are obtained with a concordance correlation coefficient (CCC) of 0.996 for arousal and 0.998 for valence using physiological data. Related work in the literature reported lower CCCs on the same data modality; thus, our approach outperforms the state-of-the-art approaches for RECOLA. Our study underscores the potential of using advanced machine learning techniques with diverse data sources to enhance the personalization of VR environments.
format Online
Article
Text
id pubmed-10303095
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10303095 2023-06-29 Prediction of Continuous Emotional Measures through Physiological and Visual Data † Joudeh, Itaf Omar Cretu, Ana-Maria Bouchard, Stéphane Guimond, Synthia Sensors (Basel) Article The affective state of a person can be measured using arousal and valence values. In this article, we contribute to the prediction of arousal and valence values from various data sources. Our goal is to later use such predictive models to adaptively adjust virtual reality (VR) environments and help facilitate cognitive remediation exercises for users with mental health disorders, such as schizophrenia, while avoiding discouragement. Building on our previous work on physiological, electrodermal activity (EDA) and electrocardiogram (ECG) recordings, we propose improving preprocessing and adding novel feature selection and decision fusion processes. We use video recordings as an additional data source for predicting affective states. We implement an innovative solution based on a combination of machine learning models alongside a series of preprocessing steps. We test our approach on RECOLA, a publicly available dataset. The best results are obtained with a concordance correlation coefficient (CCC) of 0.996 for arousal and 0.998 for valence using physiological data. Related work in the literature reported lower CCCs on the same data modality; thus, our approach outperforms the state-of-the-art approaches for RECOLA. Our study underscores the potential of using advanced machine learning techniques with diverse data sources to enhance the personalization of VR environments. MDPI 2023-06-15 /pmc/articles/PMC10303095/ /pubmed/37420778 http://dx.doi.org/10.3390/s23125613 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Joudeh, Itaf Omar
Cretu, Ana-Maria
Bouchard, Stéphane
Guimond, Synthia
Prediction of Continuous Emotional Measures through Physiological and Visual Data †
title Prediction of Continuous Emotional Measures through Physiological and Visual Data †
title_full Prediction of Continuous Emotional Measures through Physiological and Visual Data †
title_fullStr Prediction of Continuous Emotional Measures through Physiological and Visual Data †
title_full_unstemmed Prediction of Continuous Emotional Measures through Physiological and Visual Data †
title_short Prediction of Continuous Emotional Measures through Physiological and Visual Data †
title_sort prediction of continuous emotional measures through physiological and visual data †
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10303095/
https://www.ncbi.nlm.nih.gov/pubmed/37420778
http://dx.doi.org/10.3390/s23125613
work_keys_str_mv AT joudehitafomar predictionofcontinuousemotionalmeasuresthroughphysiologicalandvisualdata
AT cretuanamaria predictionofcontinuousemotionalmeasuresthroughphysiologicalandvisualdata
AT bouchardstephane predictionofcontinuousemotionalmeasuresthroughphysiologicalandvisualdata
AT guimondsynthia predictionofcontinuousemotionalmeasuresthroughphysiologicalandvisualdata