A Robotic Replacement Leg Struts Its Stuff

An advanced prosthesis makes use of a sophisticated new approach to robot walking.
August 18, 2015

A robotic lower leg that lets amputees walk more comfortably and naturally may herald a shift toward more sophisticated powered prosthetics.

The leg is being developed by Robert Gregg, an assistant professor at the University of Texas at Dallas, and his students. Sensors and software control its motion, ensuring that it quickly adapts to the wearer’s gait to maintain proper balance at all times.

Robotic prosthetics usually require active control throughout the gait cycle, or step. The leg developed in Gregg’s lab takes a more passive approach. Its control algorithm reads the leg’s position and motion from sensors and then performs a single calculation to decide when force should be applied. This makes the leg computationally and energy efficient, and it also produces a very natural gait, as tests by amputee volunteers show:
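The idea of reducing gait control to a single calculation can be sketched in code. The snippet below is purely illustrative and is not Gregg’s actual algorithm: the function names, the use of thigh angle and velocity as the sensor inputs, and all numeric thresholds are assumptions chosen for the sketch. It shows the general pattern of collapsing sensor readings into one phase value for the gait cycle, then powering the joint only during a narrow push-off window and staying passive otherwise.

```python
import math

def gait_phase(thigh_angle, thigh_velocity, scale=0.3):
    """Collapse two sensor readings into a single 0-1 gait-phase value.

    Hypothetical illustration: atan2 of (scaled velocity, angle) maps the
    cyclic thigh trajectory onto one monotonic phase variable, so a single
    calculation locates where the wearer is in the step.
    """
    angle_rad = math.atan2(-scale * thigh_velocity, thigh_angle)
    return (angle_rad % (2 * math.pi)) / (2 * math.pi)

def ankle_torque(phase, push_off_start=0.55, push_off_end=0.75, peak=40.0):
    """Apply power only during an assumed push-off window of the cycle."""
    if push_off_start <= phase <= push_off_end:
        # Smooth half-sine torque burst (N*m) during push-off
        frac = (phase - push_off_start) / (push_off_end - push_off_start)
        return peak * math.sin(math.pi * frac)
    return 0.0  # passive for the rest of the gait cycle
```

Because the phase value is derived from the wearer’s own motion rather than an internal clock, a controller structured this way would naturally speed up, slow down, or stop along with the wearer, which is consistent with the behavior the volunteers describe.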

“The feedback from the amputee patients we’ve worked with has been very positive,” Gregg says. “They felt like the prosthetic leg seemed to be following them rather than them following the leg. They can start or stop, and the leg will respond; they can go faster or slower, and the leg will respond to that naturally.”

While Gregg’s group created the software for the current prototype, the hardware was developed at Vanderbilt University. Gregg’s group is now developing its own robotic limb, which he says will be even more sensitive to the movement of the wearer’s upper leg. He believes the technology could be commercialized within a few years.

The control algorithms for the leg are based on work done by Jessy Grizzle, a professor of robotics at the University of Michigan. Grizzle’s math provides a more efficient and graceful way to perform dynamic locomotion on two legs. He and others are using the approach to control two-legged walking robots, which require far less power than other designs.

Significant progress has been made toward legged robot locomotion in recent years. Some robots, such as those developed by the Google-owned company Boston Dynamics, are capable of moving dynamically, meaning they can stay balanced while trotting or running even over rough, uneven ground (see “The Robots Running This Way”).

To avoid falling over, however, most bipedal robots still rely on a control principle that demands precise command of every movement and assumes a foot is planted on the ground. “That’s not going to work with a prosthetic, because that isn’t how people walk,” says Grizzle.

Legged locomotion is an important goal in robotics because it will enable machines to cover uneven or treacherous terrain and enter buildings designed for humans. But it is still challenging to do in the real world, as shown by many of the robots participating in a competition organized by DARPA in Pomona, California, this June (see “An Obstacle Course May Benefit Robot-Kind”). The robots performed remarkable feats of agility and dexterity but also suffered a number of embarrassing falls.

As natural as the UT Dallas prosthetic limb seems, it cannot yet move between different types of activity (for example, transitioning from walking to climbing a set of stairs) without some sort of external control. Gregg thinks it may be possible to trigger such a change using the wearer’s motion instead of an external controller. “I’m working on a new idea for that,” he says.
