A View from Christopher Mims
How Babies Know What Robots Are Thinking
New research tells us something about infants’ theory of mind, as well as how to build robots humans instinctively recognize as sentient
Computer scientists don’t usually see their labs filling up with dozens of mothers and their infants, but that’s exactly what happened to Rajesh Rao as he embarked on one of his most recent experiments. In order to discover what it takes to make an infant engage with a robot as if it were a sentient being, he had to get his hands on the real thing.
At one year of age, infants typically begin to follow the gaze of the adults in their line of sight. It’s a useful way to recognize what’s important and discern which words attach to which objects. Indeed, one theory in evolutionary biology even holds that the highly visible whites of humans’ eyes evolved to facilitate gaze following as an important mechanism of social interaction.
To test whether gaze following is important for discerning the sentience of an artificial being, Rao allowed babies to watch adults interact with HOAP-2, a humanoid robot from Fujitsu Laboratories. He tested four conditions: “normal interaction,” in which the adult followed the gaze of the robot, and “passive” conditions, in which the robot did nothing or the gaze of the adult and the robot did not sync.
When adults interacted with the robot by following its gaze as if it were another adult, babies subsequently followed the robot’s gaze. The robot did not talk and had a limited range of gestures (some of which it used in other experimental conditions), which suggests that gaze following is a unique signal to babies, and to humans generally, about the mental capacity of otherwise “inanimate” beings.
The research both provides a tightly controlled method for picking apart which features of a humanoid tell an infant that it has some level of awareness, and suggests that for social robots to interact with humans in a natural manner, gaze following must be part of their repertoire.
Neural Networks, October/November 2010