Google researchers are using imitation learning to teach autonomous robots how to pace, spin, and move in more agile ways.
What they did: Using motion-capture data recorded from sensors attached to a dog, the researchers taught a quadruped robot named Laikago several movements that are hard to achieve through traditional hand-coded robotic controls.
How they did it: First, they used the motion data from the real dog to construct simulations of each maneuver, including a dog trot, a side-step, and … a dog version of the classic ’80s dance move, the running man. (The last one was not, in fact, performed by the real dog itself. The researchers manually animated the simulated dog to dance to see if that would translate to the robot as well.) They then matched key joints on the simulated dog to corresponding joints on a simulated robot, so the robot would move the same way as the animal. Using reinforcement learning, the simulated robot then learned to stabilize the movements and correct for differences in weight distribution and design. Finally, the researchers ported the resulting control algorithm to a physical robot in the lab—though some moves, like the running man, weren’t entirely successful.
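The reinforcement-learning step above optimizes for matching the reference animal motion. A minimal sketch of such a pose-tracking reward, in the style of motion-imitation methods like this one—the function name and the scale constant are illustrative assumptions, not the researchers' actual code:

```python
import math

def pose_reward(robot_joint_angles, reference_joint_angles, scale=2.0):
    """Reward the policy for matching the retargeted reference pose.

    Returns a value in (0, 1]: 1.0 means the robot's joint angles exactly
    match the reference animal motion at this timestep; the reward decays
    exponentially as the pose deviates.
    """
    squared_error = sum(
        (robot - ref) ** 2
        for robot, ref in zip(robot_joint_angles, reference_joint_angles)
    )
    return math.exp(-scale * squared_error)

# Perfect tracking earns the maximum reward of 1.0.
print(pose_reward([0.1, -0.4, 0.25], [0.1, -0.4, 0.25]))  # 1.0
# A pose that drifts from the reference earns less.
print(pose_reward([0.3, -0.2, 0.0], [0.1, -0.4, 0.25]))
```

Summing such per-timestep rewards over an episode gives the policy a dense training signal: the closer the simulated robot stays to the dog's recorded trajectory, the higher its return.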
Why it matters: Teaching robots the complex and agile movements necessary to navigate the real world has been a long-standing challenge in the field. Imitation learning of this kind instead allows such machines to easily borrow the agility of animals and even humans.
Future work: Jason Peng, the lead author on the paper, says there are still a number of challenges to overcome. The robot's weight limits its ability to learn certain maneuvers, like big jumps or fast running. Additionally, capturing motion-sensor data from animals isn't always possible: it can be incredibly expensive and requires the animal's cooperation. (A dog is friendly; a cheetah, not so much.) The team plans to try using animal videos instead, which would make the technique far more accessible and scalable.