Socio-Assistive Technologies for Children with Autism
Individuals with cognitive, developmental, and learning disabilities, in particular Autism Spectrum Disorders (ASD), often have significant social communication impairments that can lead to social isolation. Using intelligent technologies to reduce the need for formal care services for individuals with ASD could save $90 billion annually in the US (a figure expected to double by 2017) and relieve caregivers of the “free care” they provide at the expense of their own time, social lives, and health. With the growth of longitudinal studies that detect early signs of ASD in high-risk infants and toddlers, early intervention to enhance the development of toddlers has gained great significance in recent years. Traditional methods of Social Skills Training (SST) for individuals with ASD include offline approaches such as social problem solving, pivotal response training, scripting procedures, priming procedures, and self-monitoring, among many others. Pivotal Response Training (PRT) is an evidence-based early behavioral intervention method, recognized as one of the four scientifically based treatments for autism, which teaches learners with ASD functional social-communicative and adaptive behaviors within a naturalistic teaching format.
This project has three complementary research objectives: (1) design of a real-time immersive cyber-physical system for Social Skills Training that integrates modules for real-time facial expression recognition, haptic delivery of social cues, and sensing of the user’s emotional responses through wearable biosensors; (2) design and development of advanced real-time computational methods to understand a user’s emotional responses, as exhibited implicitly through physiological signals such as skin conductance, temperature, and pulse rate during social interactions; and (3) design of an implicit interactive cyber-physical system built on decision-theoretic stochastic learning models that adapt the system’s behavior to the user’s implicit feedback in real-world social contexts. Together, these objectives serve the project’s goal of building individualized socio-assistive technologies that can be tailored to the needs and responses of each user through implicit interactive cyber-physical systems. The proposed research will be led by experts in human-centered multimedia computing, computer vision, haptic interfaces, and signal processing at the Center for Cognitive Ubiquitous Computing at Arizona State University (ASU), who will collaborate with experts in clinical speech-language pathology at ASU and in clinical autism services at the renowned Southwest Autism Research and Resource Center to deliver use-inspired outcomes.
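To make objective (3) concrete, the adaptation loop can be illustrated as a simple bandit-style learner: the system chooses a social-cue intensity, observes an implicit physiological reward (here, skin conductance relative to a comfort baseline), and updates its preferences. This is a minimal sketch only; the action names, the sensor model, the baseline value, and the reward function are illustrative assumptions, not the project's actual design.

```python
import random

# Hypothetical haptic cue intensities (illustrative names, not from the proposal).
ACTIONS = ["subtle_cue", "moderate_cue", "strong_cue"]

def arousal_reward(skin_conductance_uS, baseline_uS=2.0):
    """Map a skin-conductance reading (microsiemens) to a reward:
    readings near the baseline suggest a comfortable engagement level.
    The baseline and this reward shape are assumptions for illustration."""
    return -abs(skin_conductance_uS - baseline_uS)

class EpsilonGreedyAdapter:
    """Minimal decision-theoretic adapter: mostly exploits the cue with the
    best estimated value, occasionally explores, and updates estimates
    from the user's implicit physiological feedback."""
    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.values = {a: 0.0 for a in ACTIONS}   # running reward estimates
        self.counts = {a: 0 for a in ACTIONS}     # times each cue was tried

    def select(self):
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)         # explore
        return max(self.values, key=self.values.get)  # exploit

    def update(self, action, reward):
        self.counts[action] += 1
        n = self.counts[action]
        # Incremental mean update of the action's value estimate.
        self.values[action] += (reward - self.values[action]) / n

# Simulated interaction loop with a toy sensor model in which stronger
# cues push arousal further above baseline (an assumption for the demo).
random.seed(0)
adapter = EpsilonGreedyAdapter()
for _ in range(200):
    action = adapter.select()
    reading = {"subtle_cue": 2.1, "moderate_cue": 2.6, "strong_cue": 3.4}[action]
    reading += random.gauss(0, 0.2)               # sensor noise
    adapter.update(action, arousal_reward(reading))

preferred = max(adapter.values, key=adapter.values.get)
print(preferred)  # the cue intensity the adapter has learned to prefer
```

In a deployed system, the same select/observe/update structure would run over real biosensor streams rather than a simulated sensor model, and richer state (context, task, user history) would replace the stateless bandit shown here.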