While wearables have gotten pretty good at tracking a range of health-related activities, there’s still no simple, automated, unobtrusive way to track one of the vital things we all do every day: eat.
Researchers at Dartmouth College and Clemson University have built a prototype of a headband-like device that they think could be a step toward making this happen. Part of a project called Auracle, the gadget uses a microphone placed on the skin to capture sounds of mouth-related activities that are then analyzed to pinpoint when the wearer is eating—an activity that can be tricky to distinguish from things like talking or coughing.
Ryan Halter, an assistant professor of engineering at Dartmouth and one of the leaders of the project, says that the researchers are aiming to shrink their prototype down to hearing-aid size so that it could fit behind the ear.
Auracle isn’t trying to count calories as you eat. That’s a notoriously difficult task, in part because, as Halter notes, plenty of foods may look and sound the same but have vastly different calorie and fat contents, such as low-fat and whole-milk yogurt.
But Halter thinks Auracle could be helpful in the near future for other researchers and doctors who want to study behaviors such as dieting or eating disorders. It could offer a way to figure out things like when we eat and for how long without relying on people to keep their own records (something that can be far from accurate).
“People forget to write things down. People misconstrue what actually happened,” Halter says. “This might provide more of a ground truth about someone’s eating behavior.”
Eventually, it could also lead to a consumer wearable, such as one aimed at dieting or simply tracking eating in general. Halter can envision such a device working with your phone to send you a mid-snack message like, “Are you sure you want to keep eating all of this?”
The project is still in the early stages. So far, the researchers have conducted experiments where a group of people wore their prototype (some with just the microphone) while eating a variety of soft and crunchy foods and doing other things like talking, coughing, and sniffling. The early results show about 90 percent accuracy at distinguishing eating from other activities.
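The classification task described above can be sketched, very loosely, in code. The Auracle papers describe their actual pipeline; the sketch below is not it. It is a toy illustration of the general idea: slice an audio signal into frames, compute a couple of simple features per frame (short-time energy and zero-crossing rate), and assign each frame to the nearest labeled class. The feature choice and the nearest-centroid classifier are assumptions made here for illustration only.

```python
# Toy eating-vs-other audio classifier (illustrative only; NOT the
# Auracle team's actual method).
import math

def frame_features(signal, frame_len=160):
    """Split a 1-D signal into frames; return (energy, zero-crossing
    rate) per frame. Chewing tends toward bursty low-frequency energy,
    speech toward higher zero-crossing rates -- a rough heuristic."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        frame = signal[start:start + frame_len]
        energy = sum(x * x for x in frame) / frame_len
        zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / frame_len
        feats.append((energy, zcr))
    return feats

def centroid(points):
    """Mean of a list of equal-length feature tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def classify(feat, centroids):
    """Assign a feature vector to the nearest labeled centroid."""
    return min(centroids, key=lambda label: math.dist(feat, centroids[label]))

# Hypothetical centroids, as if learned from labeled training frames.
classes = {"eating": (1.0, 0.1), "talking": (0.2, 0.8)}
label = classify((0.9, 0.12), classes)   # a high-energy, low-ZCR frame
```

A real system along these lines would use richer spectral features and a trained statistical model rather than two hand-picked features, but the structure (frame, featurize, classify) is the standard shape of this kind of detector.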
For Auracle to be useful, though, it has to work accurately outside the lab, and the researchers know this will be tricky. Real-world settings present far more noise to filter out, and the device's contact with the skin can shift over time.