This primitive-seeming architecture, wrote Brooks, was the key to someday building artificially intelligent robots: Parts before smarts.
Brooks's insight paved the way for Boston Dynamics' lifelike robots, as well as Brooks's own company iRobot (which manufactures Roombas and bomb-disposal robots for the military). And yet a truly intelligent robot – with parts and smarts equivalent even to those of a domestic dog – has yet to be built. Why? Not because situated AI turned out to be yet another dead end, but because it ran into a newer, harder problem, known as Moravec's Paradox. "It is comparatively easy to make computers exhibit adult level performance on intelligence tests or playing checkers, and difficult or impossible to give them the skills of a one-year-old [human] when it comes to perception and mobility," roboticist Hans Moravec wrote in 1988.
So how can we solve Moravec's Paradox? One approach is to take the assumptions of situated AI to their logical endpoint: if we want to build a robot with human-like intelligence, first build a robot with human-like anatomy. A team of European researchers has done just that: their ECCERobot (Embodied Cognition in a Compliantly Engineered Robot) has a thermoplastic skeleton complete with vertebrae, phalanges, and a ribcage. Instead of rigid motors, it has muscle-like actuators and rubber tendons. It has as many degrees of freedom as a human torso, and it flops into a heap when its power is turned off, just as an unconscious human would. Most importantly, all of these parts are studded with sensors.
"The patterns of sensory stimulation that we generate from moving our bodies in space and interacting with our environment are the basic building blocks of cognition," says Rolf Pfeifer, a lead researcher on ECCERobot. "When I grasp a cup, I am inducing sensory stimulation in the hand; in my eyes, from seeing how the scene changes; and proprioceptively [in my muscles], since I can feel its weight."
These sensory patterns are the raw material for the brain to learn something about the environment and how to make distinctions in the real world, says Pfeifer, and these patterns depend strongly on the particular actions we perform with our particular body parts. "So if we want the robot to acquire the same concepts that we do," he says, "it would have to start by generating the same sensory patterns that we do, which implies that it would need to have the same body plan as we do."
For now, ECCERobot's humanoid physiology is so difficult to control that it can barely pick up an object, much less exhibit intelligent behaviour. But Pfeifer and his team aren't the only ones exploring this "anthropomimetic" strategy: Boston Dynamics, the same firm that created BigDog, is working with DARPA, the US military's research wing, to develop a humanoid robot called ATLAS which will "use the arms in conjunction with the legs to get higher levels of rough-terrain locomotion," says Raibert.
In any case, says Pfeifer, building an intelligent humanoid robot – one that "can smoothly interact with humans and human environments in a natural way" – will require breakthroughs in computing and battery efficiency, not to mention a quantum leap in sensory equipment. "A really crucial development will be skin," he says. "Skin is extremely important in the development of intelligence because it provides such rich sensory patterns: touch, temperature, pain, all at once."
A robot with skin and human-like internal anatomy starts to sound less like a robot and more like a synthetic organism – much like David in Prometheus. Which takes us back to the question he asks in the film. Or as Pfeifer more pragmatically puts it: "Why build a robot which is a very fragile and expensive copy of a human being?"
It is a very useful goal, Pfeifer argues. "Even if we still mostly want robots to do specialized tasks, there will be tons of spinoffs from an understanding of humanoid, intelligent behaviour. Yes, we'll draw inspiration from biology. But that doesn't imply that we won't go beyond it."