Scaz probes humanity, robots

Bright yellow, with a cartoonish nose and a pair of eyes, and standing about 4 inches above its base, Keepon looks like a cross between a marshmallow Peep and a snowman. And Keepon has become a YouTube sensation — because this robot can dance.

Keepon can improvise to any song with a reasonable beat, said Brian “Scaz” Scassellati, a computer science professor who received tenure last week, exposing the robot’s inner workings while putting it through its paces. Partially built by former undergraduate Marek Michalowski ’02 GRD ’03 and imported from Japan, Keepon helps Scaz and his assistants — four graduate students and about seven undergraduates — study when and why humans identify with object motion.

Computer science professor Brian Scassellati designed and built a robot with the intelligence of a 6-month-old infant. The robot was developed to interact with autistic children.
Murphy Temple

Scassellati splits his time between Keepon and two other robots: Pleo, a cat-sized dinosaur that can be programmed to act out an endless variety of movements and behaviors; and Nico, a metallic replica of the upper body of a 1-year-old child made from motor cables and metal piping.

“The central focus of a lot of [my work] is understanding how kids are able to learn effectively from other people — from their parents and from their peers,” Scassellati said. “We use robots as a way to both model this and to explore the boundaries of human social skills.”

SIMULATING HUMANITY

Scaz’s team is probing a complicated question: What makes humans ascribe qualities to Keepon?

“It’s a blob that moves around,” Scassellati pointed out. But when Keepon starts to bop its “head” to pop music, many intuitively perceive it to be dancing.

“It really challenges you to define the boundaries of what you’re willing to attribute to something,” he said.

For his part, the robotic torso Nico hardly looks human. But since Nico’s function is not to interact with others but rather to simulate a developing child, Scassellati said it is his proportions and body structure that matter.

The way in which humans interact with the world is based in part on their body structure; since, for example, children and adults have proportionally different head sizes and arm lengths, they have different capabilities.

Nico has been used to model, among other things, the way in which children develop hand-eye coordination and learn about their senses.

Recently, Scassellati and Justin Hart GRD ’12, a lab assistant and teaching assistant, were able to disprove the commonly accepted theory that children can identify the point at which their eyes focus.

Since its inception, humanoid robot development has been based on psychological or cognitive explanations of how humans think, Scassellati said.

“We turned to child development, using models from psychology, to tell us about how we might be able to build things,” he said. “But we’re also able to give back a little bit now, in terms of being able to evaluate these models and see which ones are actually coherent and which ones do what they say they do.”

WHEN ROBOTS GET SOCIAL

Pleo the dinosaur is a recently discontinued commercial toy that the team is using for its biggest research focus: autism.

With different personalities and behaviors handily programmed onto standard, interchangeable SD memory cards, Pleo clones make ideal playmates for children, motivating even those with autism to interact with them.

Meanwhile, researchers can step back and focus on a child’s interactions.

“Running an experiment is pretty much always fun,” said Elaine Short ’10, who works at Scassellati’s lab and takes one of his classes, “Intelligent Robotics.”

After documenting interactions between Pleo and non-autistic children, Scassellati’s team uses the robot to model, for autistic children, social behaviors that their disability prevents them from learning successfully. In particular, his current focus is on prosody, or the intonation of one’s voice.

“It isn’t what you say,” Scassellati said, explaining the concept. “It’s how you say it.”

He and his assistants have been working closely with autistic children at the Yale Child Studies Center Developmental Disabilities Clinic, programming Pleo to hesitate at a river drawn on a play mat and modeling the common “encouragement” prosody for the children.

“We want to see them take that behavior and use it with another child, or with their parents, or with another adult,” he said. “[But] that’s something we haven’t seen yet.”

Scassellati said he is serious about his work on autism, but does not want to oversell robots as a miracle remedy. He said he gets about one call a week from the media, and almost invariably turns them down because of the highly experimental nature of his work.

“Autism is a devastating disorder,” he says soberly. “It really impacts the whole family in a very fundamental way. It’s also an area where there are a lot of fads — I don’t want to be a fad out there. I don’t want a headline that says, ‘Robots Cure Autism.’ ”

Despite Scassellati’s tendency to shy away from media attention, his work has always been on the cutting edge.

“I knew who he was long before I applied,” Hart said. “He was in Popular Science when I was in high school, because his thesis project was hot stuff.”

That project was “COG” — a full-sized model of a human upper torso that pioneered social and developmental behaviors. In time, COG learned to play with a Slinky, determine whether people were looking at it, and discriminate between human and nonhuman movements in its environment.

“It used to be that robots were these industrial machines that sat on factory automation floors,” Scassellati said, adding, “It’s no longer the case that we can just put yellow safety tape around them and forget about them.”

Comments

  • Anonymous

    I appreciate the plug for my research, but I feel like I should clarify. Prof. Scassellati and I, along with Prof. Steve Zucker and undergraduates David Golub and Eleanor Avrunin, are studying how to enable robots to understand the structure of their bodies, including their senses, in this model.

    We haven't published any results disproving anything, yet, but we are fiddling around with a model of how infants learn the focal length of their eye (which is referred to in the article). What we're doing doesn't overtly disagree with psychology, but it will provide a proof of concept that could clarify matters, and my current analysis of the matter disagrees with what I first thought of when I heard of this theory. I'm not sure that a treatment of the matter has ever been so detailed as to really disprove it, and what we're doing is somewhat related. I guess a better way to phrase it would be that it changes the view, though I think I might have said "disprove" during our chat. I'd rather not further comment, since the result is both unfinished and unpublished.

    In a broader context, what we've enabled Nico to do so far is to continue to see in 3D after his eyes move by modeling how they move while looking through them. We believe that infants start out with little or no concept of how their bodies work, and learn this through a process of self-exploration, which we are trying to simulate on Nico.

    Anyway, thanks for the wonderful write-up on my advisor. I hope that my comment wasn't too nit-picky!