
The Year in Robotics

During the past 12 months, robots got better at grasping, smiling, and avoiding angry humans.

In the past year, researchers have developed new robots to tackle a variety of tasks: helping with medical rehabilitation, aiding military maneuvers, mimicking social skills, and grasping the unknown. Here are the highlights.

Talking to me? This robot, called Robovie, uses gaze cues to manage a conversation.

Social Smarts

The socialization of robots was an important area of research this year. Many researchers believe that giving robots social skills will make them better at assisting people in homes, schools, offices, and hospitals. Andrea Thomaz, a TR35 innovator for 2009, created robots that can learn simple grasping tasks from human instructors who use social cues, such as verbal instructions, gestures, and expressions. Another robot, made by a group at Carnegie Mellon University, guides conversations by making “eye contact” to suggest that it’s time to speak (“Making Robots Give the Right Glances”). Researchers at the University of California, San Diego, created a machine-learning program that lets a robotic head develop better facial expressions. By looking in a mirror, the robot can analyze the way its motors move different parts of the face, and create new expressions (“A Robot that’s Learning to Smile”). And a virtual robot mimicked sneakiness, hiding in virtual shadows and darting between obstacles to remain undetected (“Modeling Sneaky Robots”).

Other robots featured this year focused on the mental side of social interaction. One computer program showed that virtual robots that forgot select information created more accurate maps (“Absent-Minded Robots Remember What Matters”). And, in a fascinating experiment, generations of robots in Switzerland eventually evolved to deceive each other when their resources were limited (“Robots ‘Evolve’ the Ability to Deceive”). Lastly, some robots developed a quirky social skill, but one that could lend itself to robotic self-preservation: knowing when humans are angry. Researchers at the University of Calgary used a headband with physiological sensors to program a modified Roomba to move away from a user when it detected stress in the form of muscle tension (“A Robot that Knows When to Back Off”).

Medical Machines

While commercial medical robots such as the da Vinci Surgical System continued to appear in hospitals throughout the country, other researchers focused on improving rehabilitation devices designed to monitor and correct a patient’s movement. Knee, pelvic, and hand rehabilitation devices created at Northeastern University use electro-rheological fluid in their motors. This fluid creates resistance when a current is applied, eliminating the need for heftier or more expensive motors (“Robo-Rehab at Home”). The new motors make the smart rehabilitation devices relatively portable and lightweight, so post-stroke patients can continue physical therapy at smaller medical centers or at home.

Another interesting rehabilitation device that debuted this year is based on a modified game controller. The brainchild of researchers at George Mason University, the system is designed to assist with repetitive handwriting exercises. The device, which is relatively cheap and designed to be used at home, may help improve fine motor control in the hands of children with ADHD or mild cognitive impairments (“Cheaper Robot Rehabilitation”).

And for soldiers in the field, researchers created a robotic snake to check for signs of breathing and to deliver oxygen, if needed (“A Robomedic for the Battlefield”). The robot, based on a system originally developed for heart surgery, attaches to a stretcher so the patient can be monitored during transport.

More Robust Gripping

This year was also notable for big advances in grasping technology: simple, fast systems that let robots grab new objects quickly and robustly, using relatively simple hands. Such systems could help improve stand-alone robots and prosthetics. Researchers at Columbia University found that by giving a robotic hand the same limits in dexterity as a human hand, they could make a more efficient device (“Helping Robots Get a Grip”). A group at Harvard and Yale universities also found value in simplicity: its soft plastic hand, embedded with just a few sensors, could pick up unknown objects using a flexible grip (“A Simpler, Gentler Robotic Grip”). A new implant could also bring improvements by giving patients unprecedented control over fine movements of prosthetics (“Seamlessly Melding Man and Machine”).

On a larger scale, NASA’s new robotic arm could help astronauts by rotating and lifting heavy objects (“A Robotic Arm for Lunar Missions”).

Getting Around: Jogging, Squishing, and Soaring Bots

Boston Dynamics, the engineering company behind BigDog, gave a stunning demonstration this year of its realistic, two-legged Petman robot, which the military will use to test chemical suits (“Meet BigDog’s Two-Legged Brother”).

iRobot released a new video of another robot funded by the U.S. Defense Advanced Research Projects Agency. The Chembot, a deceptively simple-looking blob, will be able to squeeze under doors or through tiny openings, most likely for military surveillance (“iRobot Adds to a Shape-Shifting Robot Menagerie”). Other surveillance robots featured this year include a tiny flier that mimics how a maple seed falls (“Micro-Vehicle Imitates the Winged Maple Seed”) and a new sense-and-avoid visual system for unmanned aerial robots (“How to Make UAVs Fully Autonomous”).

Other robots for home and work also made advances in mobility; a robot developed by a consortium in Europe uses a system based on how a person processes visual information to navigate a cluttered environment. The technology could one day be used for a smart wheelchair (“A Robot that Navigates Like a Person”). Another robot from Brown University learned how to follow a person at a set distance, almost like a well-trained dog, by using a new, infrared image-recognition program (“Robot Plays Follow the Leader”).

