MIT Technology Review
  • Desney Tan

    Age: 31

    It’s not unusual to walk into Desney Tan’s Microsoft Research office and find him wearing a red and blue electroencephalography (EEG) cap, white wires cascading past his shoulders. Tan spends his days looking at a monitor, inspecting and modifying the mess of squiggles that approximate his brain’s electrical activity. He is using algorithms to sort through and make sense of EEG data, in hopes of turning electrodes into a meaningful computer input device as common as the mouse and keyboard.

    Credit: Robbie McClaran

    The payoff, he says, will be technology that improves productivity in the workplace, enhances video-game play, and simplifies interactions with computers. Ultimately, Tan hopes to develop a mass-market EEG system consisting of a small number of electrodes that, affixed to a person’s head, communicate wirelessly with software on a PC. The software could keep e-mail at bay if the user is concentrating, or select background music to suit different moods.

    As early as 1929, researchers observed slight changes in EEG output that corresponded to mental exertion. But these results haven’t led to a mass-market computer-input device, for a number of reasons. Most EEG experiments are conducted in labs where electrical “noise” has been minimized, but outside the lab, EEG is susceptible to electrical interference. EEG equipment also tends to be expensive. And previous research has averaged data from many users over long periods of time; some studies have shown that individual results vary widely.

    Tan believes he can solve these problems by training machine-learning algorithms, often used to understand speech and recognize photos, to account for variations between individuals’ EEG patterns and to distinguish interesting electrical signals from junk. Contrary to popular practice, Tan keeps his lab as electrically noisy as the average home or office. He is even using the least expensive EEG equipment he could find: a kit he bought for a couple of hundred dollars at a New Age store. (Some people use EEG for meditation.)

    Tan’s EEG cap has 32 electrodes that are affixed to the scalp with a conductive gel or paste. When populations of neurons fire in synchrony, they produce an electrical signal that amounts to only a few tens of microvolts by the time it reaches the scalp. Electronics within the device record the voltage at each electrode, relative to the others, and send that data to a computer.
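    One standard way to record each electrode “relative to the others” is a common-average reference, in which every channel is expressed relative to the mean of all channels. The short Python sketch below assumes that scheme; the article does not specify how Tan’s hardware is actually referenced.

        import numpy as np

        def common_average_reference(eeg):
            """Re-reference EEG so each channel is measured relative to
            the mean of all channels (a common-average reference).

            eeg: array of shape (n_channels, n_samples), raw voltages.
            """
            return eeg - eeg.mean(axis=0, keepdims=True)

        # Example: 32 electrodes sampled for 2 seconds at 256 Hz.
        raw = np.random.randn(32, 512)  # placeholder for recorded voltages
        referenced = common_average_reference(raw)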

    A subject using Tan’s system spends 10 to 20 minutes performing a series of tasks that require either high or low concentration, such as remembering letters or images for various amounts of time. EEG readings taken during the activity are fed to a computer, which manipulates them mathematically to generate thousands of derivations called “features.” The machine-learning algorithm then sifts through the features, identifying patterns that reliably indicate the subject’s concentration level when the data was collected. Tan and his collaborators at the University of Washington, Seattle, and Carnegie Mellon University have shown that a winnowed set of about 30 features can predict a subject’s concentration level with 99 percent accuracy.
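    To make that pipeline concrete, the Python sketch below computes simple band-power features from windows of EEG, winnows them to 30 with a univariate filter, and cross-validates a classifier, following the sequence described above. The feature definitions, parameters, and libraries (NumPy, scikit-learn) are illustrative assumptions, not Tan’s actual method, and the random placeholder data stands in for real recordings.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline

        def band_power_features(window, fs=256):
            """Per-channel spectral power in theta, alpha, and beta bands.

            window: array of shape (n_channels, n_samples) of raw EEG.
            A real system would generate thousands of such derivations;
            this toy version produces 3 bands x 32 channels = 96 features.
            """
            bands = [(4, 8), (8, 12), (12, 30)]  # band edges in Hz
            freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / fs)
            psd = np.abs(np.fft.rfft(window, axis=-1)) ** 2
            return np.concatenate(
                [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
                 for lo, hi in bands]
            )

        # Placeholder data: 200 two-second windows from 32 electrodes at
        # 256 Hz, each labeled 1 (high concentration) or 0 (low).
        windows = np.random.randn(200, 32, 512)
        labels = np.random.randint(0, 2, size=200)
        X = np.array([band_power_features(w) for w in windows])

        # Winnow the candidate features down to about 30, then classify.
        model = make_pipeline(SelectKBest(f_classif, k=30),
                              GradientBoostingClassifier())
        print(cross_val_score(model, X, labels, cv=5).mean())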

    Tan expects the technology to be used initially as a controller for video games, since gamers are accustomed to “strapping on new devices,” he says. In fact, next year a company called Emotiv Systems, based in San Francisco, plans to offer an EEG product that controls certain aspects of video games. However, the company will not discuss the specifics of its technology, and there isn’t widespread consensus on the feasibility and accuracy of the approach.

    The true challenge, Tan says, will be to make EEG interfaces simple enough for the masses. He and his team are working on minimizing the number of electrodes, finding a semisolid material as an alternative to the conductive gel, and developing wireless electrodes. A mass-market product could be many years away. But if Tan succeeds, getting a computer to read your thoughts could be as easy as putting on a Bluetooth headset.

    –Kate Greene