Humans and technology

Amazon’s Astro robot is stupid. You’ll still fall in love with it.

From Jibo to Aibo, humans have a long track record of falling for their robots. Except this one’s sold by Amazon.

October 4, 2021
[GIF: Amazon’s Astro robot turning around and winking. Credit: Amazon]

On September 28, Amazon introduced Astro, a “household robot.” Amazon’s launch video promises that the $999 robot, which is squat with two wheels and a rectangular screen that features two orbs for eyes, will be able to do things like watch your home or join impromptu dance parties.

This being Amazon, there’s good reason to be skeptical, especially since Astro is essentially a giant camera on wheels that will watch everything you do. So why would anyone be happy to have one in the house? The reason lies in the way our brains are wired. Years of robotics research and previous iterations of robotic assistants and pets (or “robopets”) have shown that people can’t help falling in love with them. 

Owners can become fiercely attached to their robopets. In a 2019 review of studies, scientists found that much like real pets, robopets—which included Paro (robotic seal), Justocat (robotic cat), Aibo (robotic dog), and Cuddler (robotic bear)—reduced depression and improved well-being for senior citizens. They happily caressed the robopets despite being fully aware that they weren’t actual animals. As one woman put it: “I know it is an inanimate object, but I can’t help but love her.”

And it’s not just robopets. Studies and anecdotes have shown that the Roomba—the self-propelled, disc-shaped vacuum cleaner—is often considered “part of the family,” and may even be assigned a gender and name. When the plug was pulled on the servers that powered Jibo, one of the first “social robots,” people mourned. Sony’s robot dog Aibo was completely useless, yet owners held funerals for their Aibos when the dogs finally broke down after Sony discontinued the line.

Why do we do this? It all starts with trust, says UCLA’s Mark Edmonds. He has studied why humans trust robots, and he says that by default, we tend to trust machines to do what they’ve been programmed to do. That means machines have to maintain trust rather than build it.  

Trust goes two ways here with Astro. On the surface, there’s the trust that Astro will follow commands efficiently and well. The deeper issue facing Amazon is the company’s checkered history with surveillance and privacy, especially since Astro’s primary job is home surveillance. But Edmonds says some users may be willing to be less critical of that second, creepier trust issue if Astro simply does what it’s told. “Astro has to get the functionality right first, before intimacy,” Edmonds says. “Functionality is the harder technical dimension.”

Getting humans to trust Astro may seem difficult, but Amazon has built in some key design elements to help them along, beginning with its “eyes.” It’s hard to call Astro cute—its “face” is really just a screen with two circles on it—but the circles recall the magnified eyes and dimensions of a child or baby animal. 

Robopets have long been designed with giant eyes and pouty features to make them instantly adorable to the human brain. In the early 2000s, MIT researcher Sherry Turkle began studying children who interacted with Furbies. She found that while the kids knew they were just toys, they still developed deep attachments to them, thanks in large part to their physical appearance. 

In a 2020 follow-up, Turkle writes that the therapeutic robot Paro’s eyes make people feel understood and “inspire [a] relationship… not based on its intelligence or consciousness, but on the capacity to push certain ‘Darwinian’ buttons in people (making eye contact, for example) that cause people to respond as though they were in relationship.”

Kids might be especially prone to feeling like Astro has the capacity to have a relationship with them. Judith Danovitch, an assistant professor at the University of Louisville who studies how kids interact with Alexa, says that Astro’s height, eyes, and cutesy look are definite “cues of personhood,” which might both fascinate and baffle children, particularly younger ones who are trying to figure out how to interact with other people.

“Being self-propelled is a cue for animacy for babies,” Danovitch says. “In the natural world, humans and animals are self-propelled. Rocks and other inanimate objects aren’t. It will be a challenge for young kids to understand them.”

Astro might have a secret weapon in making us fall for it: it’s really not that advanced yet. Vice got hold of leaked documents suggesting the robot is not quite as slick as the launch video makes it look (Amazon disputes this). At the moment, it can patrol the home with its built-in camera, play music, or let you make video calls. It can recognize what room it’s in and tell inhabitants apart using facial recognition.

That’s pretty much it, for now. But that isn’t necessarily a bad thing. Astro’s relatively limited set of functions could be key to helping it integrate into our families. Research has shown that people easily lose trust in robots that struggle to carry out their basic functions. “Trust is broken when machines are irrational or do the thing we don’t expect them to,” says Edmonds. The fact that Astro can’t actually do much might limit its chances to mess up (and creep us out).

“Ease of use is often a bigger predictor of home robot acceptance than explicit utility,” says Heather Knight, an assistant professor of computer science at Oregon State University whose research focuses on human-robot interaction. What makes voice assistants like Alexa so powerful is that to use them, you just plug them in and yell out their name and a command.

Amazon certainly sees Astro as a future member of the family. “We think Astro will be great for families; as we said in our blog post introducing Astro, ‘In testing, we’ve been humbled by the number of people who said Astro’s personality made it feel like a part of their family, and that they would miss the device in their home after it was gone,’” Kristy Schmidt, a spokesperson with Amazon, said in an email. And getting kids to like Astro is folded into the design: Schmidt said that Amazon Kids, the Alexa service that lets children interact with and play games on the company’s smart speakers, will also work with Astro.

As robots become more ingrained in our lives, that kind of blurring between business and personal could create a tricky conflict of interest. When you develop a relationship with your robot, what are the ethics of it trying to sell you something from its manufacturer?

This could be especially problematic for children, who don’t yet have the capacity to understand that advertising might pitch a product or service that doesn’t look exactly like what they see on TV or other media. “My guess is that when Amazon tries to share something and give a persuasive message [Astro], they’ll be confused,” Danovitch says. That could open up a host of ethical problems.

And yet, despite all this, it’s likely that we'll welcome some future version of Astro into our homes and fall for it—because we are humans, and that’s what we do.
