
Classification of Children With Autism and Typical Development Using Eye-Tracking Data From Face-to-Face Conversations: Machine Learning Model Development and Performance Evaluation

BACKGROUND: Previous studies have shown promising results in identifying individuals with autism spectrum disorder (ASD) by applying machine learning (ML) to eye-tracking data collected while participants viewed varying images (ie, pictures, videos, and web pages). Although gaze behavior is known to differ between face-to-face interaction and image-viewing tasks, no study has investigated whether eye-tracking data from face-to-face conversations can also accurately identify individuals with ASD.

OBJECTIVE: The objective of this study was to examine whether eye-tracking data from face-to-face conversations could classify children with ASD and typical development (TD). We further investigated whether combining features on visual fixation and length of conversation would achieve better classification performance.

METHODS: Eye tracking was performed on children with ASD and TD while they were engaged in face-to-face conversations (including 4 conversational sessions) with an interviewer. By implementing forward feature selection, four ML classifiers were used to determine the maximum classification accuracy and the corresponding features: support vector machine (SVM), linear discriminant analysis, decision tree, and random forest.

RESULTS: A maximum classification accuracy of 92.31% was achieved with the SVM classifier by combining features on both visual fixation and session length. The classification accuracy of combined features was higher than that obtained using visual fixation features (maximum classification accuracy 84.62%) or session length (maximum classification accuracy 84.62%) alone.

CONCLUSIONS: Eye-tracking data from face-to-face conversations could accurately classify children with ASD and TD, suggesting that ASD might be objectively screened in everyday social interactions. However, these results will need to be validated with a larger sample of individuals with ASD (varying in severity and balanced sex ratio) using data collected from different modalities (eg, eye tracking, kinematic, electroencephalogram, and neuroimaging). In addition, individuals with other clinical conditions (eg, developmental delay and attention deficit hyperactivity disorder) should be included in similar ML studies for detecting ASD.
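As a rough illustration of the pipeline the abstract describes (forward feature selection wrapped around a classifier such as an SVM, evaluated by classification accuracy), the sketch below uses scikit-learn with synthetic placeholder data. The sample size, feature layout, number of selected features, and classifier settings are assumptions made purely for illustration; this is not the authors' implementation or data.

```python
# Minimal sketch (not the authors' code): forward feature selection with an SVM,
# using synthetic placeholder data in place of the study's fixation and
# session-length features.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(26, 10))   # placeholder: one row per child, columns = candidate
                                # visual-fixation and session-length features
y = np.repeat([0, 1], 13)       # placeholder labels: 13 TD (0) and 13 ASD (1) children

# Forward selection: greedily add the feature that most improves cross-validated accuracy.
svm = make_pipeline(StandardScaler(), SVC(kernel="linear"))
selector = SequentialFeatureSelector(svm, n_features_to_select=3,
                                     direction="forward", cv=5)
selector.fit(X, y)
selected = selector.get_support(indices=True)

# Re-evaluate the classifier on the selected feature subset.
accuracy = cross_val_score(svm, X[:, selected], y, cv=5).mean()
print("selected feature indices:", selected)
print("mean cross-validated accuracy:", round(accuracy, 4))
```

The study's comparison of feature sets (visual fixation only, session length only, and combined) would, in a sketch like this, simply amount to repeating the same selection and evaluation on different column subsets and comparing the resulting accuracies.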

Bibliographic Details
Main Authors: Zhao, Zhong; Tang, Haiming; Zhang, Xiaobin; Qu, Xingda; Hu, Xinyao; Lu, Jianping
Format: Online Article Text
Language: English
Published: JMIR Publications, August 26, 2021
Journal: J Med Internet Res (Original Paper)
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8440949/
https://www.ncbi.nlm.nih.gov/pubmed/34435957
http://dx.doi.org/10.2196/29328
Copyright: ©Zhong Zhao, Haiming Tang, Xiaobin Zhang, Xingda Qu, Xinyao Hu, Jianping Lu. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 26.08.2021. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.