How Armbands Can Translate Sign Language
A research project looks at how gesture-recognition armbands can help the hearing impaired communicate more easily with those who don’t understand sign language.
A pair of Myo gesture-control armbands and a computer or smartphone may make it faster and easier for hearing-impaired people who use sign language to communicate with those who don’t understand it.
That’s what researchers at Arizona State University say they can do with a project called Sceptre. They use the armbands to teach software a range of American Sign Language gestures; then, when a person wearing the bands makes one of these signs, it is matched against its corresponding word or phrase in Sceptre’s database and displayed as text on a screen.
The hope is to facilitate communication in emergency situations in particular, such as at a doctor’s office or hospital, without relying on written communication or, as some other sign-language-recognition research has done, using cameras to recognize sign-language gestures. A paper on the work will be presented at an intelligent user interfaces conference in March.
Researchers relied on the Myo armbands to make Sceptre work because they include both an inertial measurement unit for tracking motion and electromyography sensors for muscle sensing, which can be used to help determine finger configurations. They trained software to recognize a variety of ASL gestures, as well as the signs for individual letters and the numbers one through 10, all performed by someone wearing the Myo armbands.
After having users train the software on 20 different ASL gestures like “pizza,” “happy,” and “orange” by repeating each of them three times, the researchers found Sceptre was able to recognize signs correctly nearly 98 percent of the time.
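The training procedure described above — a few repetitions per gesture, then matching new input against stored examples — can be sketched as a simple template-based classifier. This is a minimal illustration, not the Sceptre implementation: the gesture labels and feature vectors below are invented stand-ins for summarized IMU and EMG readings.

```python
# Hypothetical sketch of template-based gesture recognition.
# Each repetition of a gesture is reduced to a feature vector; the
# three repetitions are averaged into one template, and new input is
# labeled with the nearest template (Euclidean distance).
import math

def centroid(samples):
    """Average several repetitions of a gesture into one template vector."""
    n = len(samples)
    return [sum(v[i] for v in samples) / n for i in range(len(samples[0]))]

def classify(features, templates):
    """Return the gesture label whose template is nearest to the input."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda label: dist(features, templates[label]))

# Three repetitions per gesture, as in the study; the numbers here are
# made up for illustration.
training = {
    "pizza": [[0.9, 0.1, 0.2], [1.0, 0.0, 0.3], [0.8, 0.2, 0.1]],
    "happy": [[0.1, 0.9, 0.8], [0.2, 1.0, 0.7], [0.0, 0.8, 0.9]],
}
templates = {label: centroid(reps) for label, reps in training.items()}

print(classify([0.85, 0.15, 0.2], templates))  # → pizza
```

A real system would extract features from continuous sensor streams and handle signer-to-signer variation, but the core lookup — new gesture in, nearest known sign out — follows this shape.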
In a video demonstrating how it works, Prajwal Paudyal, a graduate student at Arizona State University who coauthored the paper, wears the armbands and signs several different things that are then illustrated on a computer display, like “all morning,” “headache,” and “can’t sleep.”
Though the signs were shown as text in the study, they could also be spoken aloud by an app to facilitate a conversation, Paudyal says. And while the researchers’ demo showed the text on a computer’s display, which the Myo armbands connected to via Bluetooth, Sceptre could also be used with just a smartphone—something the researchers are also working on (Myo supports streaming data from two wristbands to one smartphone, but the researchers say this wasn’t possible when they conducted their initial work).
“Ideally, the person can use this anywhere they go,” Paudyal says.
Roozbeh Jafari, an associate professor at Texas A&M’s Center for Remote Healthcare Technologies and Systems, has done similar work, though it involved building the sensors rather than using off-the-shelf devices as the ASU group did.
He says there are a number of issues that would have to be solved to make something like Sceptre work for consumers. Typically, when electromyography sensors are placed on the body, the system has to be recalibrated unless the sensors sit in exactly the same position as before, he says. There’s also a need to account for the natural variation in how different people sign the same things. Despite these obstacles, he says, “I think we are moving in the right direction.”