
Technology for Just-In-Time In-Situ Learning of Facial Affect for Persons Diagnosed with an Autism Spectrum Disorder

M. Madsen, Rana el Kaliouby, Matthew Goodwin, Rosalind W. Picard

Abstract

Many first-hand accounts from individuals diagnosed with autism spectrum disorders (ASD) highlight the challenges of processing high-speed, complex, and unpredictable social information, such as facial expressions, in real time. In this paper, we describe a new technology that helps people capture, analyze, and reflect on the social-emotional signals communicated by facial and head movements during live interactions with their everyday social companions. The system combines new hardware (a miniature camera connected to an ultramobile PC) with custom software that tracks, captures, and interprets facial and head movements and presents those interpretations intuitively (e.g., indicating a high probability that the person looks "confused"). We present this technology together with the results of a series of pilot studies in which adolescents diagnosed with ASD used it in their peer-group setting and contributed to its development through their feedback.
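
To make the capture-analyze-present loop described above concrete, the sketch below shows one hypothetical way such a pipeline could be wired together in Python with OpenCV. It is not the authors' actual software: the estimate_affect classifier, the confidence threshold, and the on-screen overlay are illustrative assumptions standing in for the paper's facial/head-movement interpreter and its intuitive presentation.

```python
import cv2

def estimate_affect(face_img):
    """Hypothetical classifier: return (label, probability) for a face crop.

    A real system would use a trained facial-expression / head-gesture model;
    this placeholder simply returns a fixed label with zero confidence.
    """
    return "confused", 0.0

# Face detector bundled with OpenCV; used here only to localize faces.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

capture = cv2.VideoCapture(0)   # stand-in for the miniature wearable camera
CONFIDENCE_THRESHOLD = 0.8      # present only high-probability interpretations

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        label, prob = estimate_affect(gray[y:y + h, x:x + w])
        if prob >= CONFIDENCE_THRESHOLD:
            # Present the interpretation intuitively, e.g., as an on-screen overlay.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, f"{label} ({prob:.0%})", (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("affect", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```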
