
Screening for Mild Cognitive Impairment with Speech Interaction Based on Virtual Reality and Wearable Devices

Bibliographic Details
Main authors: Wu, Ruixuan; Li, Aoyu; Xue, Chen; Chai, Jiali; Qiang, Yan; Zhao, Juanjuan; Wang, Long
Format: Online article (text)
Language: English
Published: MDPI, 2023
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10452416/
https://www.ncbi.nlm.nih.gov/pubmed/37626578
http://dx.doi.org/10.3390/brainsci13081222
Abstract: Significant advances in sensor technology and virtual reality (VR) offer new possibilities for the early and effective detection of mild cognitive impairment (MCI), and this wealth of data can improve the early detection and monitoring of patients. In this study, we proposed a non-invasive and effective MCI detection protocol based on electroencephalogram (EEG), speech, and digitized cognitive parameters. The EEG data, speech data, and digitized cognitive parameters of 86 participants (44 MCI patients and 42 healthy individuals) were recorded using a wearable EEG device and a VR device during the resting state and during the VR-based language tasks we designed. For the features selected under each modality combination across all language tasks, we performed leave-one-out cross-validation using four different classifiers. We then compared classification performance under multimodal data fusion using features from a single language task, features from all tasks, and a weighted voting strategy. The experimental results showed that collaborative screening with multimodal data yielded higher classification performance than single-modal features. Among the classifiers, the SVM with an RBF kernel obtained the best results, with an accuracy of 87%. The overall performance improved further under the weighted voting strategy, reaching an accuracy of 89.8%, indicating that the proposed method can capture the cognitive changes of MCI patients. The MCI detection scheme based on EEG, speech, and digitized cognitive parameters proposed in this study provides a new direction and support for effective MCI detection, suggests that VR and wearable devices are a promising route to easy-to-perform and effective MCI screening, and opens new possibilities for exploring VR technology in the field of language cognition.
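The evaluation protocol the abstract describes (leave-one-out cross-validation of an RBF-kernel SVM per language task, fused by an accuracy-weighted vote) can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the authors' code: the task names, feature dimensions, and voting weights are assumptions for the sketch.

```python
# Sketch of the evaluation strategy described in the abstract:
# per-task leave-one-out CV with an RBF-kernel SVM, then an
# accuracy-weighted vote over tasks. Data here is synthetic.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 86  # 44 MCI patients + 42 healthy controls, as in the study
y = np.array([1] * 44 + [0] * 42)

# Stand-ins for fused EEG / speech / digitized-cognitive features,
# one feature matrix per (hypothetical) language task.
tasks = {
    name: rng.normal(loc=y[:, None] * 0.8, scale=1.0, size=(n, 10))
    for name in ("task1", "task2", "task3")
}

loo = LeaveOneOut()
task_acc = {}
for name, X in tasks.items():
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    # LOO accuracy: mean over 86 single-sample test folds
    task_acc[name] = cross_val_score(clf, X, y, cv=loo).mean()
    print(f"{name}: LOO accuracy = {task_acc[name]:.3f}")

# Weighted voting: weight each task's out-of-sample MCI probability
# by that task's LOO accuracy (weights normalized to sum to 1).
weights = {t: a / sum(task_acc.values()) for t, a in task_acc.items()}
votes = np.zeros(n)
for name, X in tasks.items():
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    proba = cross_val_predict(clf, X, y, cv=loo, method="predict_proba")[:, 1]
    votes += weights[name] * proba
fused = (votes >= 0.5).astype(int)
print("weighted-vote accuracy:", (fused == y).mean())
```

The weighting rule (normalized per-task LOO accuracy) is one plausible choice; the paper itself would specify the exact weights used.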
Journal: Brain Sci (MDPI)
Published online: 21 August 2023
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).