Using Ultrasound to Feel Virtual Objects
A startup called Ultrahaptics aims to make gesture control and virtual reality more engaging by using ultrasound waves to let you feel like you’re touching virtual objects and surfaces with your bare hands.
Tom Carter, cofounder of Ultrahaptics and a computer science graduate student at the University of Bristol, says the startup’s technology could improve upon touch-free interfaces such as those enabled by Microsoft’s Kinect or Leap Motion’s device, by reflecting air pressure waves off the hand in a way that can create different sensations for each fingertip. “You actually feel like you’re interacting with a thing and getting immediate tactile feedback,” he says.
The Ultrahaptics technology is based on research that Carter conducted with colleagues at the University of Bristol.
If the resolution can be improved, applications could include interacting with moving objects in virtual reality games, or improving navigation for the visually impaired by projecting the sensation of Braille letters onto fingers in midair. Carter hopes the first products that include the technology will be available in the next two years.
For now, it’s still in the experimentation stage. A paper about the Ultrahaptics technology is being presented in Toronto at the ACM CHI Conference on Human Factors in Computing Systems, which begins on Saturday. Carter worked with Ultrahaptics cofounder Sriram Subramanian, a professor of human-computer interaction at the University of Bristol, and with researchers at the University of Glasgow. The paper examines how well people can sense ultrasonic haptic feedback on different parts of their hands, and how well they perceive it as continuous motion across the hand.
For the study, participants placed a hand, palm facing up, on a table below an array of 64 ultrasound transducers set in an eight-by-eight grid. Each transducer was connected to a circuit board, which was itself connected to a computer. The transducers emitted air pressure waves that bounced off the participants’ hands, giving the sensation of a breeze.
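The focusing trick behind such a rig is simple to sketch: each transducer is driven slightly out of phase with its neighbors so that all 64 waves arrive at a chosen point above the array at the same instant, adding up into a localized spot of pressure. The rough Python illustration below uses assumed values for transducer spacing and drive frequency, not the actual Ultrahaptics hardware parameters.

```python
import numpy as np

# Assumed parameters for illustration; not the actual Ultrahaptics hardware values.
SPEED_OF_SOUND = 343.0   # m/s, in air at room temperature
FREQUENCY = 40_000.0     # Hz, a typical ultrasonic transducer frequency
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY
PITCH = 0.01             # m, assumed spacing between neighboring transducers

# Coordinates of an eight-by-eight grid of transducers in the z = 0 plane.
offsets = (np.arange(8) - 3.5) * PITCH
transducers = np.array([(x, y, 0.0) for x in offsets for y in offsets])

def focus_phases(focal_point):
    """Drive phase (radians) for each transducer so that every wave arrives
    at focal_point in step with the others, creating a pressure focus there."""
    distances = np.linalg.norm(transducers - np.asarray(focal_point), axis=1)
    # Advance each element's phase by its path length measured in wavelengths.
    return (2 * np.pi * distances / WAVELENGTH) % (2 * np.pi)

# Example: focus the array 20 cm above its center.
phases = focus_phases((0.0, 0.0, 0.20))
```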
In one experiment, the ultrasound array focused feedback on 25 different parts of the hand to see if participants could pinpoint differences in where, precisely, they felt the waves. In the second, the array emitted waves in a way meant to feel like a line of continuous motion in a specific direction across the hand. Eventually, Carter says, this method will be used to form shapes rather than just lines.
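The second experiment can be pictured as an extension of the same idea: re-aim the focal point at a series of positions along a line, and the spot of pressure appears to travel across the palm. The sketch below reuses the focus_phases helper and transducer grid from the snippet above; drive_array is a hypothetical stand-in for whatever call would push the phases to the hardware.

```python
import time

def drive_array(phases):
    """Hypothetical stand-in for sending drive phases to the transducer hardware."""
    pass

def sweep_line(start, end, steps=50, dwell=0.02):
    """Step the focal point from start to end, re-solving the focus phases at
    each position so the pressure spot appears to move continuously."""
    start, end = np.asarray(start), np.asarray(end)
    for t in np.linspace(0.0, 1.0, steps):
        point = (1.0 - t) * start + t * end
        drive_array(focus_phases(point))  # focus_phases from the sketch above
        time.sleep(dwell)                 # dwell per step is an assumed value

# Example: sweep 20 cm above the array, roughly across the width of a palm.
sweep_line((-0.04, 0.0, 0.20), (0.04, 0.0, 0.20))
```

The step count and dwell time here are arbitrary; in practice they would be tuned against exactly the kind of perception results the study reports.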
The researchers found that some parts of the hand detect ultrasonic feedback better than others: sensations were easier to pinpoint moving across the palm, from thumb to pinky, than moving vertically up the center of the palm. They also determined that a sense of motion was best felt when waves were emitted at several different points for longer lengths of time. In addition, they found that the smallest virtual shape people could reliably feel was about two centimeters square.
Carter says there are many more perception studies to conduct. Ultrahaptics is also concentrating on refining and miniaturizing the technology, and building it into prototypes that potential customers will be able to try.
Chris Harrison, an assistant professor of human-computer interaction at Carnegie Mellon University who has tried a demonstration of the technology, says using it would make sense in games, especially for adding simple sensations like wind on your face or virtual bullets hitting your chest. He suspects getting the technology to work across a room will be tricky, though, given that it’s essentially vibrating air.
“It’s definitely weird, but it has a lot of potential,” he says.