
An Ostrich-Like Robot Pushes the Limits of Legged Locomotion

Robots are still learning to walk. Here’s one that runs on two legs.

What looks like a tiny mechanical ostrich chasing after a car is actually a significant leap forward for robot-kind.

The clever and simple two-legged robot, known as the Planar Elliptical Runner, was developed at the Institute for Human and Machine Cognition (IHMC) in Pensacola, Florida, to explore how mechanical design alone can enable sophisticated legged locomotion. A video produced by the researchers shows the robot being tested in a number of situations, including running on a treadmill and behind and alongside a car, with a steadying hand from an engineer.

In contrast to many other legged robots, this one doesn’t use sensors and a computer to help balance itself. Instead, its mechanical design provides dynamic stability as it runs. “All the intelligence is in the physical design of the robot itself,” says Jerry Pratt, a senior research scientist at IHMC who leads the team that developed the robot. Pratt's group at IHMC is working on a range of different robots.

The design might feed into future systems. “We believe that the lessons learned from this robot can be applied to more practical running robots to make them more efficient and natural looking,” Pratt adds. “Running will eventually be useful for any application that you want to do quickly and where wheels can't work well.”

Pratt previously led a team that participated in the DARPA Robotics Challenge, a contest that saw robots try to perform a series of tasks in an environment designed to simulate a nuclear disaster. The challenge showcased some spectacular technology, but numerous mishaps and falls also highlighted the difficulties robots face dealing with unfamiliar, real-world situations. Many of the robots involved in that challenge used two legs, but some were unable to walk over sand or uneven ground (see “Why Robots and Humans Struggled with DARPA’s Challenge”). 

The Planar Elliptical Runner has a single motor that drives its legs; the elliptical motion of the legs, together with the robot's body shape, provides inherent stability. The robot runs at 10 miles per hour, but the researchers say a human-sized version would travel at 20 to 30 miles per hour.
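The researchers' speed claim can be sanity-checked with a back-of-the-envelope calculation. The sketch below assumes a naive linear scaling of top speed with leg length (real dynamic-similarity arguments are more subtle), and the size ratios of 2x and 3x are assumptions chosen only to reproduce the quoted range:

```python
def scaled_speed(robot_speed_mph: float, size_ratio: float) -> float:
    """Naive estimate: top speed scales linearly with body size.

    This is an illustrative assumption, not the researchers' model.
    """
    return robot_speed_mph * size_ratio

# The robot runs at 10 mph; a human is assumed to be roughly
# 2 to 3 times its size, giving the quoted 20-30 mph range.
estimates = [scaled_speed(10, r) for r in (2, 3)]
print(estimates)  # → [20, 30]
```

Under these assumptions the arithmetic matches the article's 20-to-30-miles-per-hour figure.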

There is a small but growing interest in finding commercial uses for legged robots.

Boston Dynamics, a prominent robot maker owned by Alphabet, has shown two- and four-legged systems carrying boxes around warehouses and delivering packages outdoors. A company spun out of Oregon State University by Jonathan Hurst, a professor of mechanical engineering there, has developed another ostrich-inspired system called Cassie. Researchers at the University of Michigan, led by professor Jessy Grizzle, are developing advanced algorithms for more efficient and graceful dynamic locomotion. Machines that balance themselves dynamically can traverse difficult terrain, but they are complex, expensive, and power-hungry.

“Robots with legs will be particularly useful in places where you want a human presence, but it's too dangerous, expensive, or remote to send a real human,” Pratt says. “Examples include nuclear power plant decommissioning and planetary exploration. These are very small niche markets, though.”

