
Enhancing biofeedback-driven self-guided virtual reality exposure therapy through arousal detection from multimodal data using machine learning

Virtual reality exposure therapy (VRET) is a novel intervention technique that allows individuals to experience anxiety-evoking stimuli in a safe environment, recognise specific triggers and gradually increase their exposure to perceived threats. Public-speaking anxiety (PSA) is a prevalent form of...

Full description

Bibliographic Details
Main Authors: Rahman, Muhammad Arifur, Brown, David J., Mahmud, Mufti, Harris, Matthew, Shopland, Nicholas, Heym, Nadja, Sumich, Alexander, Turabee, Zakia Batool, Standen, Bradley, Downes, David, Xing, Yangang, Thomas, Carolyn, Haddick, Sean, Premkumar, Preethi, Nastase, Simona, Burton, Andrew, Lewis, James
Format: Online Article Text
Language: English
Published: Springer Berlin Heidelberg 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10284788/
https://www.ncbi.nlm.nih.gov/pubmed/37341863
http://dx.doi.org/10.1186/s40708-023-00193-9
author Rahman, Muhammad Arifur
Brown, David J.
Mahmud, Mufti
Harris, Matthew
Shopland, Nicholas
Heym, Nadja
Sumich, Alexander
Turabee, Zakia Batool
Standen, Bradley
Downes, David
Xing, Yangang
Thomas, Carolyn
Haddick, Sean
Premkumar, Preethi
Nastase, Simona
Burton, Andrew
Lewis, James
author_sort Rahman, Muhammad Arifur
collection PubMed
description Virtual reality exposure therapy (VRET) is a novel intervention technique that allows individuals to experience anxiety-evoking stimuli in a safe environment, recognise specific triggers and gradually increase their exposure to perceived threats. Public-speaking anxiety (PSA) is a prevalent form of social anxiety, characterised by stressful arousal and anxiety generated when presenting to an audience. In self-guided VRET, participants can gradually increase their tolerance to exposure and reduce anxiety-induced arousal and PSA over time. However, creating such a VR environment and determining physiological indices of anxiety-induced arousal or distress is an open challenge. Environment modelling, character creation and animation, psychological state determination and the use of machine learning (ML) models for anxiety or stress detection are equally important, and multi-disciplinary expertise is required. In this work, we have explored a series of ML models with publicly available data sets (using electroencephalogram and heart rate variability) to predict arousal states. If we can detect anxiety-induced arousal, we can trigger calming activities to allow individuals to cope with and overcome distress. Here, we discuss the means of effective selection of ML models and parameters in arousal detection. We propose a pipeline to overcome the model selection problem with different parameter settings in the context of virtual reality exposure therapy. This pipeline can be extended to other domains of interest where arousal detection is crucial. Finally, we have implemented a biofeedback framework for VRET in which we successfully provided feedback in the form of heart rate and brain laterality index from our acquired multimodal data for psychological intervention to overcome anxiety. (An illustrative sketch of this detection-and-feedback loop appears after the record fields below.)
format Online
Article
Text
id pubmed-10284788
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Springer Berlin Heidelberg
record_format MEDLINE/PubMed
spelling pubmed-10284788 2023-06-23 Brain Inform (Research). Springer Berlin Heidelberg, 2023-06-21. /pmc/articles/PMC10284788/ /pubmed/37341863 http://dx.doi.org/10.1186/s40708-023-00193-9 Text, en. © The Author(s) 2023. Open Access under the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title Enhancing biofeedback-driven self-guided virtual reality exposure therapy through arousal detection from multimodal data using machine learning
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10284788/
https://www.ncbi.nlm.nih.gov/pubmed/37341863
http://dx.doi.org/10.1186/s40708-023-00193-9
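
The description above outlines a closed loop: ML models trained on publicly available EEG and heart rate variability data detect anxiety-induced arousal, and the VRET environment responds with calming activities while reporting heart rate and a brain laterality index as biofeedback. The following is a minimal, hypothetical sketch of such a loop, not the authors' implementation: the feature set (RMSSD, mean heart rate, laterality index), the (right − left)/(right + left) laterality ratio over alpha-band power, the random-forest classifier, the toy training values and the action names are all assumptions made for illustration.

# Hypothetical sketch (not the authors' implementation) of an arousal-detection
# and biofeedback step for self-guided VRET. Feature names, toy training values
# and the classifier choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def laterality_index(left_power, right_power):
    # A commonly used asymmetry ratio, (R - L) / (R + L), bounded in [-1, 1];
    # the small constant guards against division by zero.
    return (right_power - left_power) / (right_power + left_power + 1e-12)

# Toy training rows: [HRV as RMSSD (ms), mean heart rate (bpm), laterality index];
# labels: 0 = calm, 1 = aroused. A real model would be trained on the publicly
# available EEG/HRV data sets mentioned in the abstract.
X_train = np.array([
    [55.0,  68.0, 0.05],
    [60.0,  65.0, 0.02],
    [20.0,  95.0, 0.40],
    [18.0, 102.0, 0.35],
])
y_train = np.array([0, 0, 1, 1])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

def biofeedback_step(rmssd, heart_rate, alpha_left, alpha_right):
    # Classify the current physiological state and return the action
    # (hypothetical action names) that the exposure environment would take.
    li = laterality_index(alpha_left, alpha_right)
    aroused = clf.predict([[rmssd, heart_rate, li]])[0] == 1
    return "trigger_calming_activity" if aroused else "continue_exposure"

print(biofeedback_step(rmssd=22.0, heart_rate=98.0, alpha_left=4.0, alpha_right=9.0))

In practice the classifier would be trained and validated on the EEG/HRV corpora referred to in the abstract, and the returned action would drive the VR scene, for example by starting a breathing exercise or lowering the exposure intensity.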
work_keys_str_mv AT rahmanmuhammadarifur enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT browndavidj enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT mahmudmufti enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT harrismatthew enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT shoplandnicholas enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT heymnadja enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT sumichalexander enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT turabeezakiabatool enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT standenbradley enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT downesdavid enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT xingyangang enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT thomascarolyn enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT haddicksean enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT premkumarpreethi enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT nastasesimona enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT burtonandrew enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning
AT lewisjames enhancingbiofeedbackdrivenselfguidedvirtualrealityexposuretherapythrougharousaldetectionfrommultimodaldatausingmachinelearning