Yale researchers may have found a new way to detect autism in infants as young as six months old.

The investigators tracked the eye movements of six-month-old infants presented with three different types of faces. They found that infants with autistic siblings — who are roughly ten times more likely to develop autism themselves — paid less attention to key facial features than low-risk infants did, but only when the face shown was speaking. While further research is needed to determine the causes of these particular looking patterns, the finding may help doctors diagnose autism before symptoms appear at around two years of age, said Katarzyna Chawarska GRD ’00, a professor in the Child Study Center and study co-author.

In the study, which was published in the February issue of the journal Biological Psychiatry, infants were shown a static picture of a woman, a video of a woman smiling and a video of a woman smiling while speaking a nursery rhyme. The study’s first discovery — that high-risk infants paid less attention to faces in general than low-risk infants — was not a surprise, as previous research has demonstrated that autistic individuals spend less time looking at faces and social cues, said Frederick Shic, a professor in the Child Study Center and study lead author. What was surprising, he said, was the second discovery: infants who later developed autism spectrum disorder (ASD) turned away from the inner features of speaking faces only. These infants focused instead on outer features, which may be indicative of their risk for developing ASD, Shic said.

“The inner areas of the face contain important facts about a person’s identity, emotions and cognitive state,” he said. “These six-month-old infants, if not looking at these inner areas, aren’t looking at these critically important areas of the face.”

The challenge now is for researchers to determine why speech was the only condition that shifted high-risk infants’ attention to outer features. There are two possible explanations, Shic said: first, that the woman’s speech made the infants avoid looking at the inner features, or second, that the speech left infants confused about where to place their attention.

Of the two, Shic said that the second is more likely. High-risk infants might suffer from a sort of “attentional confusion,” he said, as children with autism do not always seem to be able to process simultaneous speech and visual tasks as efficiently as normally developing children.

While these findings look promising for helping to diagnose autism, Shic said, the study must be replicated with a larger sample size before clinical applications become apparent. At 122 infants, the sample size for this experiment was small, and the technology used to track infants’ attention may not be feasible for widespread use, said Suzanne Macari, a research scientist at the Yale Child Study Center and study co-author.

“We’ve been finding these atypicalities using very sensitive eye-tracking instruments,” Macari said. “Even really great clinicians are not able to pick up these differences when interacting with infants. So in terms of reaching application, I’m not sure if we’re there yet.”

About one in 88 children is affected by ASD, according to the Centers for Disease Control and Prevention.

VIVIAN WANG