Clicking through multiple layers of menus and scrolling through tiny pages isn't the most efficient way to work with electronic information; just ask someone squinting at the screen of a personal digital assistant. Liam Comerford and his colleagues at IBM's Watson Research Center in Yorktown Heights, NY, have developed an alternative that allows interaction through conversational voice commands. Called the Personal Speech Assistant, this handheld goes above and beyond the voice-activated menu commands available on other devices. It understands natural-language queries such as "Show me my address book" or "When's my next appointment?" The assistant extracts the pertinent information from its database and answers with synthetic speech. It also tailors its responses to the user's needs; if someone fast-forwards through the detailed spoken instructions, for example, the device automatically starts delivering shorter prompts. As an added bonus, the prototype translates English phrases into any of five languages. The prototype (photo) is a stationary unit that cradles a Palm III, but within a year the software should be available for handheld-device manufacturers to incorporate into their products. "It's mostly a matter of the right party stepping up and saying, 'Gimme,'" says Comerford.