Clicking through multiple layers of menus and scrolling through tiny pages isn't the most efficient way to work with electronic information; just ask anyone squinting at the screen of a personal digital assistant. Liam Comerford and his colleagues at IBM's Watson Research Center in Yorktown Heights, NY, have developed an alternative that allows interaction through conversational voice commands.

Called the Personal Speech Assistant, the handheld goes beyond the voice-activated menu commands available on other devices. It understands natural-language queries such as "Show me my address book" or "When's my next appointment?", extracts the pertinent information from its database, and answers with synthetic speech. It also tailors its responses to the user's needs; if someone fast-forwards through the detailed spoken instructions, for example, the device automatically switches to shorter prompts. As an added bonus, the prototype translates English phrases into any of five languages.

The prototype (photo) is a stationary unit that cradles a Palm III, but within a year the software should be available for handheld-device manufacturers to incorporate into their products. "It's mostly a matter of the right party stepping up and saying, 'Gimme,'" says Comerford.