— This robot's leg movements have a crucial effect on the information received by its eyes (Image: Olaf Sporns/Max Lungarella)
Experiments involving real and simulated robots suggest that the relationship between physical movement and sensory input could be crucial to developing more intelligent machines.
Tests involving two real and one simulated robot show that feedback between sensory input and body movement is crucial to navigating the surrounding world. Understanding this relationship better could help scientists build more life-like machines, say the researchers involved.
Scientists studying artificial intelligence have traditionally separated physical behaviour and sensory input. "But the brain's inputs are not independent," says Olaf Sporns, a neuroscientist at Indiana University, US. "For example, motor behaviour has a role to play in what the body senses from the environment."
An increasing number of researchers are taking this approach, known as "embodied cognition", says Sporns. He worked with roboticist Max Lungarella of Tokyo University in Japan to create experiments that would test the idea.
They used a four-legged walking robot, a humanoid torso and a simulated wheeled robot. All three robots had a computer vision system trained to focus on red objects. The walking and wheeled robots automatically move towards red blocks placed nearby, while the humanoid robot grasps red objects, moving them closer to its eyes and tilting its head for a better view.
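The article does not describe the vision pipeline in detail. One common way to build this kind of red-object tracker is simple colour thresholding followed by a centroid computation; the sketch below (an illustration with made-up threshold values, not the researchers' actual code) shows the idea on a synthetic frame.

```python
import numpy as np

def red_centroid(image, r_min=150, gb_max=80):
    """Return the (row, col) centroid of strongly red pixels, or None.

    image: H x W x 3 uint8 RGB array. Threshold values are illustrative:
    a pixel counts as "red" if its red channel is high and its green
    and blue channels are both low.
    """
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    mask = (r > r_min) & (g < gb_max) & (b < gb_max)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic 100x100 frame with a red block in the upper-left quadrant.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[20:30, 10:20] = (200, 0, 0)

print(red_centroid(frame))  # → (24.5, 14.5), inside the red block
```

A robot controller would then steer so as to move this centroid towards the centre of the image, closing the loop between what the camera sees and how the body moves.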
To measure the relationship between movement and vision, the researchers recorded data from the robots' joints and fields of vision. They then used a mathematical technique to quantify how much of a causal relationship existed between sensory input and motor activity.
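The article does not name the technique, but one standard measure of directed information flow between two time series is transfer entropy, which asks how much better one signal's next value can be predicted when the other signal's past is known. The sketch below (my assumption, not necessarily the authors' exact implementation) estimates it for discrete-valued series with history length one.

```python
import numpy as np
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Estimate transfer entropy from series x to series y, in bits.

    Discrete-valued series, history length 1:
    TE = sum p(y1, y0, x0) * log2( p(y1 | y0, x0) / p(y1 | y0) )
    where y1 is y's next value and y0, x0 are current values.
    """
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y1, y0, x0) counts
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y1, y0) counts
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y0, x0) counts
    singles_y = Counter(y[:-1])                     # y0 counts
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y1 | y0, x0)
        p_cond_y = pairs_yy[(y1, y0)] / singles_y[y0]  # p(y1 | y0)
        te += p_joint * log2(p_cond_full / p_cond_y)
    return te

# Toy demonstration: y copies x with a one-step delay, so information
# flows from x to y but not from y to x.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 1000)
y = np.roll(x, 1)

te_xy = transfer_entropy(x.tolist(), y.tolist())
te_yx = transfer_entropy(y.tolist(), x.tolist())
print(te_xy)  # ≈ 1 bit
print(te_yx)  # ≈ 0 bits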
"We saw causation of both kinds," Sporns says. "Information flows from sensory events to motor events and also from motor events to sensory events." It is an important experimental demonstration of this aspect of embodied cognition, he claims: "This work and that of others is now making it more practical and less of a metaphor."
Similar experiments ought to show the same relationship in animals, he adds, as evolution has produced bodies and brains that work together to understand the world. Such tests would be much harder to carry out, but Sporns says researchers are starting to investigate how it might be done.
The experiments could suggest a better way to design and build robots, Sporns adds. Maximising information flow between sensory and motor systems could produce more flexible, capable systems, he says. Experiments involving more simulated robots, "evolved" using genetic algorithms, suggest this to be a promising approach, he says.
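The article mentions "evolving" simulated robots with genetic algorithms only in passing. For readers unfamiliar with the method, the toy sketch below (a generic genetic algorithm with a stand-in fitness function, not the researchers' setup) shows the basic loop: selection, crossover and mutation over a population of candidate designs.

```python
import random

def evolve(fitness, genome_len=8, pop_size=30, generations=50,
           mutation_rate=0.1, seed=42):
    """Minimal genetic algorithm: tournament selection, uniform
    crossover and bit-flip mutation over binary genomes."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        next_pop = []
        for _ in range(pop_size):
            p1, p2 = tournament(), tournament()
            # Uniform crossover: each gene comes from either parent.
            child = [g1 if rng.random() < 0.5 else g2
                     for g1, g2 in zip(p1, p2)]
            # Bit-flip mutation with small per-gene probability.
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy fitness: count of 1-bits. In the robot experiments the fitness
# would instead be a sensorimotor information-flow score obtained by
# simulating each candidate robot.
best = evolve(fitness=sum)
print(best, sum(best))
```

In the approach Sporns describes, each genome would encode a robot's body or controller parameters, and fitness would reward high information flow between its sensory and motor systems.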
Daniel Polani, who researches artificial intelligence at Hertfordshire University in the UK, also sees promise. "Using similar approaches, it should be possible to produce more efficient cognitive systems, like those in nature, without specialising on a particular task" such as movement or vision, he told New Scientist.
Aaron Sloman, another artificial intelligence researcher, at Birmingham University in the UK, says interaction with the environment is vital to intelligence. But he also points out that the human brain is capable of working with concepts not grounded in the physical world.
"That is why you can think about a person in Birmingham whom you have never met," he says. "How does an architect design a new skyscraper, long before there is anything to see, feel, touch, or manipulate?"
Journal reference: PLoS Computational Biology (DOI: 10.1371/journal.pcbi.0020144)