Automated Eye Contact Tracking Technology for Autism

Author: Georgia Tech
Published: 2012/09/26 - Updated: 2026/01/28
Publication Details: Peer-Reviewed, Product Release, Update
Category Topic: Autism - Related Publications

Synopsis: This peer-reviewed research from Georgia Tech's Center for Behavior Imaging presents automated technologies that measure behavioral markers in children with autism spectrum disorder. The study describes two innovations: gaze-tracking glasses combined with facial-analysis software that detect eye contact with 80 percent accuracy, and wearable accelerometer systems that identify and classify problem behaviors with up to 95 percent accuracy. The research is authoritative because it underwent peer review, was funded by a $10 million National Science Foundation grant, and involved collaboration between Georgia Tech researchers and the Marcus Autism Center. These technologies matter for families and clinicians because they replace labor-intensive manual video analysis with automated detection systems, potentially enabling large-scale autism screening and real-time behavioral monitoring in homes and schools rather than just clinical settings - Disabled World (DW).

Definition: Autism Spectrum Disorder (ASD)

Autism, or autism spectrum disorder (ASD), is a lifelong, nonprogressive neurological disorder that typically appears before the age of three. The term describes a developmental disability that significantly affects verbal and non-verbal communication and social interaction. People with ASD may behave, communicate, interact, and learn in ways that differ from most others, and often nothing about how they look sets them apart from other people. The classic form of autism involves a triad of impairments: in social interaction; in communication and language use; and in imagination, as reflected in restricted, repetitive, and stereotyped patterns of behavior and activities.

Introduction

Researchers in Georgia Tech's Center for Behavior Imaging have developed two new technological tools that automatically measure relevant behaviors of children and promise to significantly advance the understanding of behavioral disorders such as autism.

One of the tools - a system that uses special gaze-tracking glasses and facial-analysis software to identify when a child makes eye contact with the glasses-wearer - was created by combining two existing technologies to develop a novel capability of automatic detection of eye contact. The other is a wearable system that uses accelerometers to monitor and categorize problem behaviors in children with behavioral disorders.

Both technologies are already being deployed in the Center for Behavior Imaging's (CBI) ongoing work to apply computational methods to the screening, measurement, and understanding of autism and other behavioral disorders.

Main Content

Children at risk for autism often display distinct behavioral markers from a very young age. One such marker is a reluctance to make frequent or prolonged eye contact with other people. Discovering an automated way to detect this and other telltale behavioral markers would be a significant step toward scaling autism screening up to much larger populations than are currently reached. This is one goal of the five-year, $10 million "Expeditions" project, funded in the fall of 2010 by the National Science Foundation under principal investigator and CBI Director Jim Rehg, also a professor in Georgia Tech's School of Interactive Computing.

The eye-contact tracking system begins with a commercially available pair of glasses that can record the focal point of their wearer's gaze.

An adult interacting with the child wore the glasses, whose front-facing camera captured video of the child. That video was then processed with facial-recognition software from a second manufacturer. By combining the glasses' built-in ability to track the wearer's gaze with the software's ability to detect the child's gaze direction, the researchers produced a system that detected eye contact in a test interaction with a 22-month-old with 80 percent accuracy. The study was conducted in Georgia Tech's Child Study Lab (CSL), a child-friendly experimental facility richly equipped with cameras, microphones, and other sensors.
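The combination described above can be illustrated with a simplified, frame-by-frame sketch. This is not the researchers' actual pipeline; the data structures, field names, and coordinates below are hypothetical stand-ins for the glasses' gaze output and the facial-analysis output, assuming eye contact is flagged when the wearer's gaze lands on the child's face while the child's gaze is directed back at the camera.

```python
# Hypothetical sketch of eye-contact detection from two data streams:
# the wearer's gaze point (from the glasses) and the child's detected
# face and gaze direction (from facial-analysis software).

def wearer_looking_at_face(gaze_point, face_box):
    """True if the wearer's gaze point falls inside the child's
    detected face bounding box (left, top, right, bottom)."""
    x, y = gaze_point
    left, top, right, bottom = face_box
    return left <= x <= right and top <= y <= bottom

def detect_eye_contact(frames):
    """Return indices of frames where mutual gaze appears to occur.

    Each frame is a dict with:
      'gaze_point' - (x, y) wearer gaze location in image coordinates
      'face_box'   - child's face bounding box, or None if undetected
      'child_gaze_toward_camera' - bool from the face-analysis step
    """
    contact_frames = []
    for i, frame in enumerate(frames):
        if frame['face_box'] is None:
            continue  # no face detected in this frame
        if (wearer_looking_at_face(frame['gaze_point'], frame['face_box'])
                and frame['child_gaze_toward_camera']):
            contact_frames.append(i)
    return contact_frames

# Example: three frames, with mutual gaze only in the middle one.
frames = [
    {'gaze_point': (50, 50), 'face_box': (100, 100, 200, 200),
     'child_gaze_toward_camera': True},   # wearer not looking at face
    {'gaze_point': (150, 150), 'face_box': (100, 100, 200, 200),
     'child_gaze_toward_camera': True},   # mutual gaze: eye contact
    {'gaze_point': (150, 150), 'face_box': (100, 100, 200, 200),
     'child_gaze_toward_camera': False},  # child looking away
]
print(detect_eye_contact(frames))  # [1]
```

Running such a check over every video frame is what replaces the hours of manual frame-by-frame coding that Rehg describes below.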

"Eye gaze has been a tricky thing to measure in laboratory settings, and typically it's very labor-intensive, involving hours and hours of looking at frames of video to pinpoint moments of eye contact," Rehg said. "The exciting thing about our method is that it can produce these measures automatically and could be used in the future to measure eye contact outside the laboratory setting. We call these results preliminary because they were obtained from a single subject. Still, all humans' eyes work the same way, so we're confident the successful results will be replicated with future subjects."

The other new system, developed in collaboration with the Marcus Autism Center in Atlanta and Dr. Thomas Ploetz of Newcastle University in the United Kingdom, is a package of sensors worn via straps on the wrists and ankles that uses accelerometers to detect movement by the wearer. Algorithms developed by the team analyze the sensor data to automatically detect episodes of problem behavior and classify them as aggressive, self-injurious, or disruptive (e.g., throwing objects).

To develop the algorithms, researchers first placed the sensors on four Marcus clinic staff members, who together performed 1,200 instances of different behaviors. The system detected "problem" behaviors with 95 percent accuracy and classified all behaviors with 80 percent accuracy. When the sensors were then used with a child diagnosed on the autism spectrum, the system detected the child's problem-behavior episodes with 81 percent accuracy and classified them with 70 percent accuracy.
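As a rough illustration of how accelerometer readings can be turned into behavior detections, one common pattern is to slide a window over the signal, compute simple movement features, and flag windows that exceed a threshold. The team's actual features, window sizes, and classifier are not described in this article, so everything below is an illustrative assumption rather than the study's method.

```python
import math

# Hypothetical windowed analysis of wrist/ankle accelerometer data.
# Window size and variance threshold are illustrative values only.

def magnitude(sample):
    """Overall acceleration magnitude of one (x, y, z) sample."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def window_features(window):
    """Mean and variance of acceleration magnitude over a window."""
    mags = [magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, var

def detect_episodes(samples, window_size=4, var_threshold=1.0):
    """Return start indices of windows whose movement variance
    exceeds a threshold - a crude stand-in for episode detection."""
    flagged = []
    for start in range(0, len(samples) - window_size + 1, window_size):
        _, var = window_features(samples[start:start + window_size])
        if var > var_threshold:
            flagged.append(start)
    return flagged

# Example: steady, gravity-only readings followed by erratic movement.
calm = [(0.0, 0.0, 1.0)] * 4
burst = [(0.0, 0.0, 1.0), (3.0, 2.0, 1.0),
         (-2.0, 1.0, 4.0), (0.5, 0.2, 1.1)]
print(detect_episodes(calm + burst))  # [4]
```

Classifying a flagged episode as aggressive, self-injurious, or disruptive would then require a trained classifier over richer features, which is where the team's machine-learning work comes in.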

"These results are auspicious in leading the way toward more accurate and reliable measurement of problem behavior, which is important in determining whether treatments targeting these behaviors are working," said CSL Director Agata Rozga, a research scientist in the School of Interactive Computing and co-investigator on the Expeditions award. "Our ultimate goal with this wearable sensing system is to gather data on the child's behavior beyond the clinic, in settings where the child spends most of their time, such as their home or school. In this way, parents, teachers, and others who care for the child can be potentially alerted to times and situations when problem behaviors occur so that they can address them immediately."

"What these tools show is that computational methods and technologies have great promise and potential impact on the lives of many children and their parents and caregivers," said Gregory Abowd, Regents' Professor in the School of Interactive Computing and a prominent researcher in technology and autism. "These technologies we are developing, and others developed and explored elsewhere, aim to bring more effective early-childhood screening to millions of children nationwide, as well as enhance care for those already diagnosed on the autism spectrum."

Both technologies were presented in early September at the 14th ACM International Conference on Ubiquitous Computing (Ubicomp 2012). Other devices under study at the CSL include a camera-and-software system that tracks children's facial expressions and customized speech-analysis software that detects vocalization patterns.

Insights, Analysis, and Developments

Editorial Note: The convergence of wearable sensors, machine vision, and behavioral science represents a meaningful shift in how autism spectrum disorder can be identified and monitored. While these technologies showed promising accuracy rates in controlled studies, their true value will emerge when they move beyond laboratory walls into everyday environments where children actually live and interact. The ability to gather behavioral data continuously in natural settings could fundamentally change how parents, educators, and clinicians understand individual patterns and respond to behavioral challenges in real time. As computational methods continue advancing, the gap between clinical assessment and daily support may finally begin to close, offering families practical tools rather than just diagnostic labels - Disabled World (DW).

Attribution/Source(s): This peer-reviewed publication was selected for publishing by the editors of Disabled World (DW) due to its relevance to the disability community. Originally authored by Georgia Tech and published on 2012/09/26, this content may have been edited for style, clarity, or brevity.

Related Publications

Face to Face with Autism Facial Expressions: Researchers study facial expressions in a real-life social context to explore emotion recognition in autism.

Study Questions Eye Contact as Autism Marker, Suggests New Approaches: New research challenges the assumption that reduced eye contact defines autism, suggesting natural play behaviors offer better diagnostic and intervention insights.

Visual Pattern Preference May be Sign of Autism in Children: Preference for geometric patterns early in life may be a signature behavior in infants who are at-risk for autism.

Relatives of Autistics Display Abnormal Eye Movements: Abnormal eye movements and other impairments appear common in unaffected family members of individuals with autism.

Autism & Beyond: Autism Facial Expressions App: Autism & Beyond App screens for autism by reading facial expressions of children for emotional cues.

Detecting Autism in Children with an Eye Test: Measuring how a child's pupils change in response to light could potentially be used to screen for autism spectrum disorders (ASD).

Should the autism spectrum be split apart? A critical examination of nosological unity and diagnostic heterogeneity.

New research shows autistic adults face 1.5x higher hospital readmission rates for mental health conditions, revealing critical gaps in accessible care.

Study finds no increased spatial working memory decline in older adults with autistic traits compared to neurotypical peers, offering reassurance about cognitive aging.

APA: Georgia Tech. (2012, September 26 - Last revised: 2026, January 28). Automated Eye Contact Tracking Technology for Autism. Disabled World (DW). Retrieved March 13, 2026 from www.disabled-world.com/health/neurology/autism/high-tech.php
MLA: Georgia Tech. "Automated Eye Contact Tracking Technology for Autism." Disabled World (DW), 26 Sep. 2012, revised 28 Jan. 2026. Web. 13 Mar. 2026. <www.disabled-world.com/health/neurology/autism/high-tech.php>.
Chicago: Georgia Tech. "Automated Eye Contact Tracking Technology for Autism." Disabled World (DW). Last modified January 28, 2026. www.disabled-world.com/health/neurology/autism/high-tech.php.

While we strive to provide accurate, up-to-date information, our content is for general informational purposes only. Please consult qualified professionals for advice specific to your situation.