Stratification of Children with Autism Spectrum Disorder Through Fusion of Temporal Information in Eye-Gaze Scan Paths

Document Type

Article

Publication Date

2023

DOI

10.1145/3539226

Publication Title

ACM Transactions on Knowledge Discovery from Data

Volume

17

Issue

2

Pages

Article 25 (pp. 1-20)

Abstract

Background: Differences in looking patterns have been shown to separate individuals with Autism Spectrum Disorder (ASD) from Typically Developing (TD) controls. Recent studies have shown that, in children with ASD, these patterns change with intellectual and social impairments, suggesting that patterns of social attention provide indices of clinically meaningful variation in ASD.

Method: We conducted a naturalistic study of children with ASD (n = 55) and typical development (TD, n = 32). A battery of eye-tracking video stimuli was used in the study, including Activity Monitoring (AM), Social Referencing (SR), Theory of Mind (ToM), and Dyadic Bid (DB) tasks. This work reports on the feasibility of spatial and spatio-temporal scan-paths generated from eye-gaze patterns of these paradigms in stratifying ASD and TD groups.

Algorithm: This article presents an approach for automatically identifying clinically meaningful information contained within the raw eye-tracking data of children with ASD and TD. The proposed mechanism combines eye-gaze scan-paths (spatial information), fused with temporal information and pupil velocity data, and uses a Convolutional Neural Network (CNN) for stratification of diagnosis (ASD or TD).
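
The article itself does not include source code; as a rough illustration of the general idea described above, the sketch below (all names, resolutions, and architectural choices are illustrative assumptions, not the authors' implementation) renders a gaze scan-path as an image whose color channels encode spatial density, temporal order, and gaze velocity, and feeds the result to a small CNN for binary stratification.

# Hypothetical sketch: encode a gaze scan-path as an image whose channels carry
# spatial density, temporal order, and velocity, then classify ASD vs. TD with
# a small CNN. Parameters and architecture are assumptions, not from the paper.
import numpy as np
import torch
import torch.nn as nn

IMG = 64  # rendered scan-path image resolution (assumption)

def scanpath_to_image(x, y, t, size=IMG):
    """Render gaze samples (x, y in [0, 1], timestamps t) into a 3-channel image:
    ch0 = fixation density, ch1 = normalized visit time, ch2 = gaze velocity."""
    img = np.zeros((3, size, size), dtype=np.float32)
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-6)
    vx, vy = np.gradient(x, t), np.gradient(y, t)          # per-sample velocity
    speed = np.sqrt(vx**2 + vy**2)
    speed = speed / max(speed.max(), 1e-6)
    cols = np.clip((x * (size - 1)).astype(int), 0, size - 1)
    rows = np.clip((y * (size - 1)).astype(int), 0, size - 1)
    for r, c, tv, sv in zip(rows, cols, t_norm, speed):
        img[0, r, c] += 1.0        # spatial density
        img[1, r, c] = tv          # when this location was visited (temporal fusion)
        img[2, r, c] = sv          # how fast the eye was moving (velocity fusion)
    img[0] /= max(img[0].max(), 1e-6)
    return img

class ScanPathCNN(nn.Module):
    """Small CNN for binary stratification (ASD vs. TD) from scan-path images."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * (IMG // 4) ** 2, 64), nn.ReLU(),
            nn.Linear(64, 2),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

if __name__ == "__main__":
    # Synthetic example: one random scan-path of 500 gaze samples.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 10, 500))
    x, y = rng.uniform(0, 1, 500), rng.uniform(0, 1, 500)
    image = torch.from_numpy(scanpath_to_image(x, y, t)).unsqueeze(0)
    logits = ScanPathCNN()(image)
    print("class logits (ASD vs. TD):", logits.detach().numpy())

The channel layout here is only one plausible way to fuse temporal and velocity information into a scan-path image; the paper's stated contribution is the fusion itself, not any particular rendering scheme.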

Results: Spatial eye-gaze representations in the form of scan-paths are feasible for stratifying ASD and TD (ASD vs. TD, DNN: 74.4%). These spatial eye-gaze features, e.g., scan-paths, are shown to be sensitive to factors mediating heterogeneity in ASD: age (ASD: 2-4 y/old vs. 10-17 y/old, CNN: 80.5%), gender (Male vs. Female ASD, DNN: 78.0%), and the mixture of age and gender (5-9 y/old Male vs. 5-9 y/old Female ASD, DNN: 98.8%). Limiting scan-path representations temporally increased variance in stratification performance, attesting to the importance of the temporal dimension of eye-gaze data. Spatio-temporal scan-paths that incorporate eye-movement velocity in their eye-gaze images are shown to outperform other feature representation methods, achieving a classification accuracy of 80.25%.

Conclusion: The results indicate the feasibility of scan-path images for stratifying ASD and TD diagnosis in children of varying ages and genders. Infusion of temporal information and velocity data improves the classification performance of our deep learning models. These novel velocity-fused spatio-temporal scan-path features are shown to capture eye-gaze patterns that reflect age, gender, and the mixed effect of age and gender, factors that are associated with heterogeneity in ASD and with the difficulty of identifying robust biomarkers for ASD.

Rights

© 2023 Association for Computing Machinery

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Original Publication Citation

Atyabi, A., Shic, F., Jiang, J., Foster, C. E., Barney, E., Kim, M., Li, B., Ventola, P., & Chen, C. H. (2023). Stratification of children with Autism Spectrum Disorder through fusion of temporal information in eye-gaze scan-paths. ACM Transactions on Knowledge Discovery from Data, 17(2), 1-20, Article 25. https://doi.org/10.1145/3539226
