Real-Time Simultaneous Recurrent Neural Network Implementation for Robot-Mediated Intervention in Autism Spectrum Disorders

Location

Old Dominion University, Learning Commons at Perry Library, West Foyer

Start Date

4-8-2017 8:30 AM

End Date

4-8-2017 10:00 AM

Description

Children with Autism Spectrum Disorder (ASD) face challenges in social communication and interaction that may be addressed through computer-based intervention. Earlier studies report "oddity," a lack of natural traits, in the facial expressions of affected individuals. This project develops a novel hardware-software platform intended for use in future intervention paradigms targeting facial oddity biomarkers. Real-time recognition of the six basic facial expressions (happy, sad, anger, fear, surprise, and disgust) plus neutral is achieved using a deep artificial neural network modeled with the TensorFlow machine intelligence library. The network is trained and tested on the Extended Cohn-Kanade (CK+) facial expression dataset, then deployed in a real-time software application that processes and labels facial expressions frame by frame from a live video feed. The application is written in the Python programming language and interfaced with the NAO robot. The resulting hardware-software platform is intended for future studies of the efficacy of robot-mediated intervention in children with ASD.
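As a rough illustration of the frame-by-frame labeling pipeline described above, the following minimal Python sketch loads a trained expression classifier and labels faces in a live camera feed. It is not the project's code: the model file name (expression_cnn.h5), the 48x48 grayscale input size, the label ordering, and the use of OpenCV's Haar cascade face detector with a local webcam standing in for the NAO camera stream are all illustrative assumptions.

# Minimal sketch (assumptions noted above), not the authors' implementation.
import cv2
import numpy as np
import tensorflow as tf

LABELS = ["anger", "disgust", "fear", "happy", "sad", "surprise", "neutral"]  # assumed order

model = tf.keras.models.load_model("expression_cnn.h5")  # hypothetical trained model file
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # live video feed (webcam here; NAO camera in the platform)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        # Crop, resize, and normalize the face region for the classifier (assumed 48x48 input).
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        label = LABELS[int(np.argmax(probs))]
        # Draw the predicted expression label on the frame.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
    cv2.imshow("expression", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()

In the described platform, frames would instead come from the NAO robot's camera (for example via its Python SDK), and the predicted labels would drive the robot-mediated interaction rather than an on-screen overlay.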

Presentation Type

Poster
