
Semi-Supervised Behavior Labeling Using Multimodal Data during Virtual Teamwork-Based Collaborative Activities

Bibliographic Details
Main Authors: Plunk, Abigale; Amat, Ashwaq Zaini; Tauseef, Mahrukh; Peters, Richard Alan; Sarkar, Nilanjan
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10098747/
https://www.ncbi.nlm.nih.gov/pubmed/37050584
http://dx.doi.org/10.3390/s23073524
author Plunk, Abigale
Amat, Ashwaq Zaini
Tauseef, Mahrukh
Peters, Richard Alan
Sarkar, Nilanjan
collection PubMed
description Adaptive human–computer systems require the recognition of human behavior states to provide real-time feedback to scaffold skill learning. These systems are being researched extensively for intervention and training in individuals with autism spectrum disorder (ASD). Autistic individuals are prone to social communication and behavioral differences that contribute to their high rate of unemployment. Teamwork training, which is beneficial for all people, can be a pivotal step in securing employment for these individuals. To broaden the reach of the training, virtual reality is a good option. However, adaptive virtual reality systems require real-time detection of behavior. Manual labeling of data is time-consuming and resource-intensive, making automated data annotation essential. In this paper, we propose a semi-supervised machine learning method to supplement manual data labeling of multimodal data in a collaborative virtual environment (CVE) used to train teamwork skills. With as little as 2.5% of the data manually labeled, the proposed semi-supervised learning model predicted labels for the remaining unlabeled data with an average accuracy of 81.3%, validating the use of semi-supervised learning to predict human behavior.
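The record does not state which semi-supervised algorithm, classifier, or multimodal features the authors used, so the sketch below is only an illustrative stand-in rather than the paper's method: it masks all but roughly 2.5% of the labels on synthetic features and uses scikit-learn's SelfTrainingClassifier (an assumed choice) to pseudo-label the remainder, then scores those predictions, mirroring the labeling setup described in the abstract.

# Minimal sketch of semi-supervised behavior labeling with ~2.5% of samples labeled.
# Assumptions: the actual multimodal features, base classifier, and semi-supervised
# algorithm are not given in this record; synthetic data and a self-training wrapper
# stand in for them.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(0)

# Stand-in for multimodal features (e.g., gaze, speech, input events) and behavior labels.
X, y = make_classification(n_samples=4000, n_features=20, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

# Mark all samples as unlabeled (-1), then reveal labels for ~2.5% of them.
y_semi = np.full_like(y, -1)
labeled_idx = rng.choice(len(y), size=int(0.025 * len(y)), replace=False)
y_semi[labeled_idx] = y[labeled_idx]

# Self-training: fit on the labeled subset, then iteratively pseudo-label
# high-confidence unlabeled samples and refit.
base = RandomForestClassifier(n_estimators=200, random_state=0)
model = SelfTrainingClassifier(base, threshold=0.8)
model.fit(X, y_semi)

# Evaluate the predicted labels on the samples that were initially unlabeled.
unlabeled_idx = np.setdiff1d(np.arange(len(y)), labeled_idx)
acc = accuracy_score(y[unlabeled_idx], model.predict(X[unlabeled_idx]))
print(f"Accuracy on initially unlabeled samples: {acc:.3f}")
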
format Online
Article
Text
id pubmed-10098747
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10098747 2023-04-14 Semi-Supervised Behavior Labeling Using Multimodal Data during Virtual Teamwork-Based Collaborative Activities. Plunk, Abigale; Amat, Ashwaq Zaini; Tauseef, Mahrukh; Peters, Richard Alan; Sarkar, Nilanjan. Sensors (Basel), Article. MDPI 2023-03-28. /pmc/articles/PMC10098747/ /pubmed/37050584 http://dx.doi.org/10.3390/s23073524. Text, en. © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Semi-Supervised Behavior Labeling Using Multimodal Data during Virtual Teamwork-Based Collaborative Activities
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10098747/
https://www.ncbi.nlm.nih.gov/pubmed/37050584
http://dx.doi.org/10.3390/s23073524