Inhabit This Teddy Bear’s Body Using Virtual Reality
Japanese startup Adawarp thinks teleporting inside the body of a robotic stuffed animal could be a good way to keep in touch with loved ones.
New ways of communicating via computers frequently go on to be widely adopted.
Companies inventing things to do with virtual reality headsets like the Oculus Rift, which launches next year, mostly use them to transport you into imaginary worlds. Tatsuki Adaniya has a different idea—teleporting you into the body of a robotic teddy bear.
Adaniya has built software that lets you strap on an Oculus Rift headset and peer out through the bear’s eyes. You can talk to people near the bear through its speaker and hear them through its microphone, allowing for a two-way conversation with you in the role of a stuffed animal.
When you turn your head, so does the bear, thanks to a motion sensor attached to the headset’s strap. You move the bear’s arms with an Xbox controller. “We’re broadcasting human body language,” Adaniya says.
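The head-tracking scheme Adaniya describes can be sketched roughly as follows: orientation readings from the headset-mounted sensor drive the servos in the bear’s neck. This is a minimal illustrative sketch, not Adawarp’s actual code; the function names and the assumed servo ranges are hypothetical.

```python
def clamp(value, low, high):
    """Keep a servo command inside its mechanical limits."""
    return max(low, min(high, value))

def head_to_servo(yaw_deg, pitch_deg):
    """Map headset yaw/pitch (degrees) to bear neck servo angles.

    Assumes, for illustration, that the bear's neck can pan +/-90 degrees
    and tilt +/-45 degrees; headset motion beyond that range is clamped
    so the robot simply holds its limit rather than straining the motors.
    """
    return {
        "pan": clamp(yaw_deg, -90.0, 90.0),
        "tilt": clamp(pitch_deg, -45.0, 45.0),
    }

# A wearer looking 120 degrees to the side exceeds the bear's assumed
# range, so the pan command saturates at 90.
print(head_to_servo(120, -10))
```

In a real system the sensor would stream readings many times per second, and each one would be converted and sent to the motors this way, which is what lets the bear mirror the wearer’s head movements in close to real time.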
Adaniya thinks children and some adults will be interested in taking on the persona of a stuffed animal—such as a bear, cat, or dog—for fun, or as an unusual way to stay in touch with distant friends or relatives. His company, Adawarp, recently completed River, a startup incubator focused on virtual reality that invests at least $200,000 in each company in its program. Adaniya’s project began after he broke up with a long-distance girlfriend and thought about what could have helped them communicate.
I tried out Adaniya’s creation in a tiny conference room. When I pulled on the Rift headset I was transported across the table into the body of the bear. Fluffed-out fur rimmed the edge of my vision as I peeked out at Adaniya and, to his left, my own body.
The uncanny sense of being outside myself subsided surprisingly soon. My former body now felt like just a passive observer in the room. Being able to turn my robotic head helped give the feel of a normal conversation by making it possible to maintain a semblance of eye contact. Adaniya helped by focusing his attention on the bear.
“The impression of the word ‘robot’ is scary and big,” Adaniya told me. “I don’t want to feel like this is a robot. I want to feel this is an animal, or a new spirit.” My joystick-controlled arms looked a little robotic, but Adawarp plans to eventually capture arm movement directly with a motion sensor.
By the end of 2016, Adaniya aims to ship a version of his robot with a plain plastic body priced at $200 or less. That version is aimed at encouraging hardware developers to build their own bodies for it. He is also working on making it possible to control the robot without a virtual reality headset, by panning a mobile phone around. A furry consumer version will follow; Adaniya thinks versions that look like cats, dogs, and bears could all be popular.
Cindy Bethel, director of the Social, Therapeutic and Robotic Systems Laboratory at Mississippi State University, says that Adaniya’s idea has some potential but will also face challenges. Children are likely to prefer seeing the face of a parent via video chat to interacting with them in bear form, she says. But the ability to touch or hug a tangible figure could be beneficial, says Bethel.
Having a person take the form of a robot might be a boon in situations where a child needs to talk with an unfamiliar adult, such as a therapist or tutor, says Bethel. A small, cuddly bear could feel less threatening and be easier to open up to than a stranger.
However, Bethel also notes that having a robot take on the role of a person risks the effect known as the “uncanny valley,” in which an artificial creation tries and fails to be humanlike, creating a sense of revulsion instead. “If for some reason it doesn’t move naturally, that could be kind of creepy to people,” she says.