Visionaries

For our 13th annual celebration of people who are driving the next generation of technological breakthroughs, we’re presenting the stories in a new way. We’ve grouped them by categories that reflect the variety of approaches that people can take to big problems. The Visionaries are anticipating how technology can make life better.

  • Age: 27

    Eric Migicovsky

    How he invented the smart watch.

    It’s 2008. Eric Migicovsky is racking up kilometers every day on his sturdy blue opafiets—the no-nonsense bicycle beloved by Netherlanders. He’s wheeling to classes at Delft University of Technology and other points in a city famous for its canals and blue-and-white pottery.

    Life’s great for the young Canadian engineer on a year abroad from Ontario’s University of Waterloo. Except for one constant irritant. His cell phone never stops chiming, chirping, or vibrating. And prudence requires two hands firmly gripping the handlebars while veering through traffic between those picturesque canals.

    “I read a survey that said the average person pulls out their cell phone 120 times a day,” he says. “It occurred to me, ‘Hey, what if I could just do it on my wrist?’”

    Back in his dorm room, Migicovsky started fiddling with an electronic breadboard, an Arduino microcontroller, and bits scavenged from a Nokia 3310. The mishmash became a precursor to a prototype—a “smart” wristwatch wirelessly tethered to a cell phone so that it could display e-mails, texts, and other basic notifications. “Plus tell time,” he adds. “That still seemed a useful function for a watch.”

    He eventually transformed his toy into one of this year’s most influential new technologies—the Pebble smart watch. Today Migicovsky runs a company in Palo Alto that has 31 employees and sells watches in Best Buy for $150 apiece.

    Copycats have sprung up, and Apple looms as a likely competitor. Migicovsky is already responding by rethinking how people might use the Pebble. It could become less of a notification display and more of an app platform in its own right. Migicovsky recently released a software development kit intended to help other innovators devise applications solely for the watch—traffic trackers, weather predictors, exercise monitors, and games.

    Getting here wasn’t easy. Back at Waterloo, Migicovsky worked with a few pals on an early version of the watch—the first generation was called “inPulse”—in the garage of their rented house. In 2011, the project was accepted into Y Combinator, which provides modest seed money, advice, and critical contacts for technologists. That brought Migicovsky to California. “If I had to pick someone who will be the next Steve Jobs, it would be Eric,” says Y Combinator founder Paul Graham.

    But big investments remained elusive. As a long shot, Migicovsky posted Pebble on the fund-raising site Kickstarter. He thought he might reel in $100,000. “In 30 days, we raised $10.2 million,” he says. “The smart watch revolution had begun.”

    Colin Nickerson

  • Age: 33

    Julie Kientz

    If you want to use technology to make life better for people with autism and their families, the trick is to make the technology secondary.

    Julie Kientz is an expert in human-computer interaction. But unlike many other computer scientists, she spends much of her time far away from a computer screen, figuring out the human side of the equation.

    With her people-first perspective on technology, the University of Washington professor is at the forefront of an emerging idea: using relatively simple and common computing tools to improve human health. Kientz has created novel ways of helping people with sleep disorders and families with autistic children, such as a program that uses Twitter to help track key developmental milestones. “I think a lot of people in our area are like, ‘I have a hammer, let’s find a nail,’” says A. J. Brush, a senior researcher at Microsoft. “She’s really thinking hard about what’s the challenge, how to address it, how do I understand it.”

    Kientz’s methods were formed in graduate school at Georgia Tech. Her doctoral advisor, Gregory Abowd, an expert in interactive computing and its use in health care, happens to have two sons with autism. His dedication to them inspired Kientz to investigate technology that could improve their care. But she didn’t begin with the technology. She trained to be a therapist for autistic children and worked as one for a year and a half.

    During sessions with an autistic child, a therapist might ask the child to point to a specific item, like an apple, from an array of objects; to imitate a word or gesture; or to copy the therapist’s arrangement of blocks. Therapists use pen and paper to chart the child’s ability to perform such tasks over time.

    Lullaby, shown here as a prototype, is meant to collect data from people with sleep disorders.

    By working as a therapist and talking to others, Kientz identified problems with the paper-based method. One was that multiple therapists might need to review a child’s records, but there was only one copy of the binder filled with hand-marked charts and notes. And with data points trapped on paper, there wasn’t a good way to visualize broader trends or review negative blips in a child’s otherwise positive progress.

    Kientz’s solution was for therapists to use a digital recording pen and special paper that could digitize their writing. The change was unobtrusive to the therapist and invisible to the child. But notes and chart inputs made their way automatically into a database and were synched with video recordings of each session. This meant therapists could project progress graphs at meetings and pinpoint moments when a child didn’t perform as well as expected. They could immediately access video from that moment in a therapy session; in one instance, therapists reviewed the video and agreed that they each had different standards for a “right” response. As a result, the child was given credit for mastering a skill and could move on to new challenges.
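
    The linkage itself is simple. As a minimal sketch in Python (purely illustrative, not Kientz’s actual system, which ran on a digital pen and its own database), each chart mark carries a wall-clock timestamp, so finding the matching moment in the session video is a matter of subtracting the recording’s start time:

```python
from datetime import datetime

# Minimal sketch of the syncing idea only (illustrative; not Kientz's code):
# each pen mark carries a wall-clock timestamp, so jumping to the matching
# moment in the session video is just a subtraction from the video's start.

def video_offset(note_time: datetime, video_start: datetime):
    """Return how far into the session recording a pen mark was made."""
    return note_time - video_start

session_start = datetime(2013, 5, 6, 10, 0, 0)     # when the camera started
missed_trial = datetime(2013, 5, 6, 10, 17, 42)    # timestamp of a flagged trial
print(video_offset(missed_trial, session_start))   # -> 0:17:42 into the video
```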

    To Kientz, this human-centered use of computing was an antidote to frustrating internships she had held as an undergraduate at the University of Toledo, including one at Compaq in which she wrote debugging programs for a microchip. “It was really hard for me to see that connection between what I thought was the really impactful work and what I was doing on a day-to-day basis,” she says, speaking in an office littered with geek ephemera such as a software engineer Barbie doll. (Kientz is married to Washington professor Shwetak Patel, an Innovator Under 35 in 2009.)

    Through her work with autistic children, Kientz learned that federal health officials at the Centers for Disease Control and Prevention were looking for ways to spot signs of autism and developmental delays earlier in children’s lives. When she dove in, interviewing parents and doctors, she realized that many families were already recording information the government was looking for, but their formats—snapshots, video, baby books—were hard to integrate with the conventional tracking data gathered by health professionals.

    Kientz wondered if there was a way to combine the two kinds of data gathering. That led her to build a computer program called Baby Steps while she was still in grad school. It combined traditional baby-book functions (asking parents to post pictures of sentimental moments like a child’s first trip to the zoo or to Grandma’s house) with ways to record specific developmental milestones (is the baby making eye contact?). Baby Steps has been tested by a handful of families, and Kientz has a $500,000 grant from the National Science Foundation to explore whether the program could scale up to track milestones for any child in Washington state whose parents want to take part.

    In this project, too, Kientz is deciding how to develop the technology only after first understanding how people might use it. She found that many Hispanic families in Washington don’t have home PCs and are more likely to go online using phones. So she added phone-friendly features such as the ability to respond to prompts from text messages or Twitter. For example, parents can follow a Twitter account that corresponds to the month their child was born. They might get a prompt that includes an age-appropriate milestone and a code so that their reply will get filed in the database. They might see:

    @BabySteps_Nov2012: Does your baby turn his/her head in the direction of a loud noise? #baby68

    And then they could respond:

    @juliekientz: #Yes #Maya turns her head in the direction of a loud noise #baby68
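
    A rough sketch of how such a reply could be filed automatically is below, in Python; the hashtag patterns and the in-memory "database" are assumptions made for illustration, not the actual Baby Steps code:

```python
import re
from datetime import datetime, timezone

# Rough sketch (not the actual Baby Steps code): pull the milestone code and
# the parent's answer out of a reply tweet and file them under that milestone.
MILESTONE_CODE = re.compile(r"#(baby\d+)", re.IGNORECASE)   # e.g. "#baby68"
ANSWER = re.compile(r"#(yes|no|notyet)\b", re.IGNORECASE)   # parent's response

def file_reply(handle, text, database):
    """Store the parsed reply in a per-milestone list."""
    code, answer = MILESTONE_CODE.search(text), ANSWER.search(text)
    if not (code and answer):
        return  # not a milestone reply; ignore it
    database.setdefault(code.group(1).lower(), []).append({
        "parent": handle,
        "answer": answer.group(1).lower(),
        "note": text,
        "recorded": datetime.now(timezone.utc).isoformat(),
    })

db = {}
file_reply("juliekientz",
           "#Yes #Maya turns her head in the direction of a loud noise #baby68",
           db)
print(db["baby68"][0]["answer"])   # -> "yes"
```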

    For another project, Kientz is trying to make it much easier for people with sleep disorders to figure out what’s wrong. Typically, they might have to go to a lab and get loaded up with electrodes for the night; later, they might sit in front of specialized equipment to test things like how their reaction time suffers when they’re experiencing a sleep deficit. Kientz wanted to help people do all this themselves, at home. So she and collaborators from UW’s medical and nursing programs built a prototype called Lullaby. It’s a box with light, temperature, and motion sensors sticking out, wired to a computer and a touch-screen tablet. Patients wear an unobtrusive commercial gadget such as the Fitbit, which tracks exercise by day and sleep patterns by night. They don’t have to fill out sleep logs, which are notoriously inaccurate. And to replace the lab exams measuring reaction times, Kientz’s group developed a smartphone app that lets people test themselves.
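
    The reaction-time piece is simple enough to sketch. The Python below is a bare-bones, terminal-based stand-in for the idea (the group’s real tool is a smartphone app, and everything here is an illustrative assumption): it measures how quickly a user responds to prompts that appear after unpredictable delays, the kind of measure that worsens with sleep deficit:

```python
import random
import statistics
import time

# Bare-bones stand-in for a reaction-time self-test (illustrative only; the
# real tool is a smartphone app): press Enter as soon as the prompt appears.
def reaction_time_trial():
    time.sleep(random.uniform(2.0, 5.0))        # unpredictable wait
    start = time.monotonic()
    input("GO! Press Enter now: ")
    return time.monotonic() - start

def run_session(trials=5):
    times = [reaction_time_trial() for _ in range(trials)]
    print(f"median reaction time: {statistics.median(times) * 1000:.0f} ms")

if __name__ == "__main__":
    run_session()
```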

    Getting inspiration from actual human problems is leading Kientz and her graduate students in surprising directions—such as software they recently developed to help visually impaired people do yoga. “I feel like there’s two routes you can go in research in my field,” she says. “You can help a lot of people in a little way. Or you can help a few people in a big way.”

    Jessica Mintz

  • Age: 34

    Per Ola Kristensson

    New computing devices are inspiring new ways to input text.

    Per Ola Kristensson is making it easy, fast, and intuitive to input text on mobile devices. He helped invent the popular gestural text-entry method known as ShapeWriter, but that’s just the beginning. Kristensson, a lecturer in human-computer interaction at the University of St. Andrews in Scotland, thinks gestures could be combined with speech recognition and even gaze recognition in a text-entry system that makes it easier to correct mistakes and enter unpronounceable information like passwords. “I’m interested in optimizing the flow of information from your brain into the computer,” he says.

    ShapeWriter lets you enter text by dragging a finger over the letters in a word. The software then stores the squiggle or shape that you make when you touch those letters as a stand-in for the word itself. The shapes for common words are easy to recall; any time you want to enter such a word, you can quickly reproduce its shape instead of pecking at the letters again. Practiced users can gesture-type in excess of 30 words per minute—blinding speed on the typical mobile device. The ShapeWriter app was downloaded more than a million times from Apple’s App Store before it was bought by Nuance Communications in 2010. Now the technology is built into Android, where it’s called “gesture typing.”
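
    The matching idea behind gesture typing can be pictured with a toy Python example; this is not ShapeWriter’s actual algorithm, and the keyboard layout and distance measure are simplifying assumptions. Each word’s template is the path through its letters’ key centers, both template and trace are resampled to the same number of points, and the closest template wins:

```python
import math

# Toy gesture-keyboard matcher (not ShapeWriter's real algorithm; the key
# layout and the distance measure are simplifying assumptions).
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEYS = {ch: (x + 0.5 * y, float(y))
        for y, row in enumerate(ROWS) for x, ch in enumerate(row)}

def word_template(word):
    """The ideal path a finger would trace for this word."""
    return [KEYS[c] for c in word]

def resample(path, n=32):
    """Interpolate a path to n evenly spaced points along its length."""
    if len(path) == 1:
        return path * n
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for k in range(n):
        target = total * k / (n - 1)
        while j < len(path) - 2 and dists[j + 1] < target:
            j += 1
        seg = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / seg
        (x0, y0), (x1, y1) = path[j], path[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def match(trace, vocabulary):
    """Return the vocabulary word whose template lies closest to the trace."""
    g = resample(trace)
    def cost(word):
        t = resample(word_template(word))
        return sum(math.hypot(a - c, b - d) for (a, b), (c, d) in zip(g, t))
    return min(vocabulary, key=cost)

print(match(word_template("hello"), ["hello", "world", "help"]))   # -> "hello"
```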

    Kristensson, who has a quick smile and an easy laugh, has always sought to fuse disparate fields of inquiry. Growing up in Sweden, he bucked an educational system designed to channel students into narrow specializations. He was drawn to computer science but couldn’t bear spending four years studying nothing else. So he opted for cognitive science, which enabled him to study not only computer science but also linguistics, philosophy, and psychology. That combination launched him on the path to creating user interfaces that are fundamentally changing the way we interact with computers.

    His work on tools for disabled people illustrates his approach to problem solving. Many people who can’t speak and have very limited manual dexterity communicate by slowly typing words and prompting a computer to pronounce them. Their communication speed averages one or two words per minute. In such a laborious process, predicting the speaker’s intent can greatly accelerate the task. This requires what is known as a statistical language model. “I was amazed to find that in 30 years of development of this kind of technology, no one had produced a good statistical model for the things these people need to say,” Kristensson explains.

    The main problem is the dearth of data from which to derive statistical relationships. You can’t wiretap the computers used by large numbers of disabled people. So Kristensson came up with an alternative: ask people who are not disabled to imagine what they would say if they had to communicate by this method. He used Amazon’s Mechanical Turk to crowdsource imagined communications—”Who will drive me to the doctor tomorrow?” and “I need to make a shopping list.” Then he combed through Twitter, blogs, and Usenet for phrases that were statistically similar to the ones generated by Mechanical Turk. After several iterations, he had the tens of millions of phrases he needed to build a useful model.
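
    One crude way to picture that filtering step is sketched below in Python. It is an illustrative stand-in rather than Kristensson’s actual pipeline: a mined phrase is kept only if it is close enough, by cosine similarity over word counts, to at least one crowdsourced seed phrase:

```python
import math
from collections import Counter

# Illustrative stand-in (not Kristensson's actual pipeline): keep a mined
# phrase only if its word-count vector is close to some crowdsourced seed.
def bag(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def filter_candidates(seeds, candidates, threshold=0.4):
    seed_bags = [bag(s) for s in seeds]
    return [c for c in candidates
            if max(cosine(bag(c), s) for s in seed_bags) >= threshold]

seeds = ["Who will drive me to the doctor tomorrow?",
         "I need to make a shopping list."]
mined = ["Can you drive me to the doctor on Friday?",
         "The stock market closed higher today."]
print(filter_candidates(seeds, mined))   # keeps only the first phrase
```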

    These days, Kristensson is working on technology that supports super-fast typing: a gargantuan statistical language model that accurately interprets typed input despite large numbers of mistakes. He’s also working on new ways to enter text in the absence of a touch screen or keyboard. Such technology will be necessary to make the most of wearable computing devices such as Google Glass, but it will have to work nearly perfectly to be of any benefit, given how frustrating a bad speech-to-text system can be. “In a few years, we’ll have amazing sensors that will help us generate contextual information to create truly intelligent, adaptive interfaces,” he says.

    Ted Greenwald

  • Age: 33

    Lina Nilsson

    Lowering the cost of basic biological research.

    The UC Berkeley bioengineering lab where Lina Nilsson worked as a postdoc is filled with the kind of expensive equipment necessary for advanced biological research. But many labs around the world don’t have UC-level funding; they rely on hand-me-downs from well-heeled labs or simply do without. That makes it hard for them to find solutions to local problems such as the spread of malaria, never mind participating in the broader scientific enterprise.

    Nilsson offers another option: DIY. As cofounder of Tekla Labs, an engineering collective on the Berkeley campus, she’s curating and distributing open-source, do-it-yourself designs for the gamut of common lab gear. A shaker for separating excess dye from stained cells, for instance, can be made from a discarded record turntable. A centrifuge can be fashioned from a modified kitchen blender. A thermal cycler for amplifying DNA requires only light bulbs and thermometers. In the hands of scientists who historically have lacked access to equipment, such tools can be powerful engines of innovation—generally, Nilsson says, at about one-tenth the price of high-end commercial equipment.
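
    The control logic such gear needs is modest, as the hedged Python sketch below suggests. The temperature stages are standard PCR settings, but read_temperature and set_bulb are placeholders for whatever thermometer and relay a lab wires up; this is not a Tekla Labs design:

```python
import time

# Hedged sketch of a light-bulb thermal cycler's control loop (not a Tekla
# Labs design). read_temperature() and set_bulb() are placeholders to be
# replaced with a real thermometer readout and relay driver.
PCR_STAGES = [(95.0, 30), (55.0, 30), (72.0, 60)]   # (target deg C, hold seconds)

def read_temperature():
    raise NotImplementedError("replace with your thermometer readout")

def set_bulb(on):
    raise NotImplementedError("replace with your relay control")

def run_cycles(cycles=30, tolerance=0.5):
    for _ in range(cycles):
        for target, hold_seconds in PCR_STAGES:
            deadline = None
            while deadline is None or time.monotonic() < deadline:
                temp = read_temperature()
                set_bulb(temp < target - tolerance)   # bang-bang: bulb on when too cool
                if deadline is None and abs(temp - target) <= tolerance:
                    deadline = time.monotonic() + hold_seconds  # start the hold timer
                time.sleep(1.0)
```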

    Left: A magnetic stirrer was designed by a Tekla Labs contributor in New Zealand. Right: A rotator built by a member of the Tekla Labs team is designed to gently agitate biological samples.

    “Great ideas are everywhere, but opportunity is not,” she says. “My goal is to enable people to collaborate to solve global challenges.” Along with her work at Tekla Labs, she serves as innovation director at UC Berkeley’s Blum Center for Developing Economies, where she devises programs that bring together NGOs, scientists, engineers, and local organizations worldwide.

    Nilsson was an outstanding, if uninspired, PhD candidate at the University of Washington in 2007 when, on a whim, she applied for a Bonderman Travel Fellowship, an open-ended program that gives students eight months to “come to know the world in new ways.” She traveled to Asia and South America, where she visited local biology labs. “It completely changed everything about how I see the world,” she says. “The discordance between the engagement of the scientists and their empty labs was jarring, and the vision for Tekla Labs started to emerge.”

    The challenge now is to make sure Tekla Labs’ designs consistently yield devices precise and durable enough for serious research. After all, scientists everywhere need equipment they can rely on.

    Ted Greenwald

  • Age: 25

    Steve Ramirez

    An MIT grad student can find and even change memories in a mouse’s brain.

    “My parents came here from El Salvador in the late ’70s to escape from civil war. They worked 100-hour weeks to give me and my brother and sister the opportunity of a better life. Years later, we have all these opportunities that we couldn’t have dreamed of in El Salvador. I can’t think of any better motivator.

    The first seeds of my interest in the brain were planted between junior high and high school, when my cousin went into labor. While under anesthesia during a C-section, she went into a coma that she’s been in ever since. The parts of her brain that are involved in producing consciousness and wakefulness were probably atrophied because they didn’t get enough oxygen for just a short period of time. It instantly hit me: all it takes are these little lumps of tissue in your brain to atrophy, and now everything that makes you you is evaporated.

    Because the seemingly ephemeral stuff of cognition is based on the physical stuff of the brain, we can go in and manipulate it and see how something as complicated as memory works. When you are thinking of a memory, only a subset of brain cells are active, and those cells are specifically representing that memory. We can genetically modify neurons to produce a sensor that detects when brain cells are active and then installs an on-off switch in them. The switch is a protein that allows us to control the activity of a cell with light.

    So now we can emit light and reactivate cells and see whether a mouse exhibits behaviors that show whether it is recalling a certain memory. We place the animal in a box where it gets mild foot shocks from the floor. Naturally, if we later put the mouse back in the box, it runs to a corner in fear—it sits there and freezes, crouching and monitoring. Next, we put the mouse in a completely different box—different smells, sights, floor texture. In this new box the mouse has no reason to be afraid. But when we shine a light to reactivate the cells involved in making that fear memory, the animal immediately goes into that defensive posture. We can also shine light and reactivate pleasurable memories, such as a male mouse’s memory of a female mouse.

    In my second project, we tried to get a mouse to believe that it experienced something that it didn’t. We called it Project Inception. First, we label the brain cells that are involved in the memory of a chamber—environment A—where nothing bad happens. The next day we put the mouse in environment B, where it gets foot shocks, and we simultaneously shine a light to reactivate the memory of environment A. Then, when you put the mouse back in environment A, it displays freezing behavior. It is recalling falsely that it was shocked in environment A even though nothing happened there.

    We are pushing this technology as far as possible. Perhaps we can alleviate post-traumatic stress disorder by erasing the underlying traumatic memory. Or perhaps we can treat certain types of depression by updating negative memories with positive emotions. Science fiction can often inform reality.”

    as told to Susan Young

  • Age: 29

    Laura Schewel

    Looking more closely at the way people move through cities.

    When Laura Schewel worked for an energy think tank and then the Federal Energy Regulatory Commission, she wanted to develop policies that would stimulate sales of electric cars. The trouble was, there was no comprehensive, reliable data about where and when people drive.

    Typically, transportation experts construct predictive models to describe traffic patterns, or they conduct expensive surveys. Neither is particularly easy to do. “We have no idea what’s happening on the roads. Just none,” Schewel says. “When you compare that to what we know about what people watch on TV, it’s absurd.” 

    While in a PhD program at the University of California, Berkeley, she realized that people actually were revealing where they drive—to their cell-phone companies and GPS navigation services. She thought: what if I could get access to that data? It took a year to persuade companies to sell this valuable and sensitive information to a small startup she formed, StreetLight Data. The company, which aggregates and analyzes the signals from cell phones and dashboard GPS navigation systems, makes it easy for just about anyone to do what Schewel had long envisioned—see detailed maps of where, when, and how people travel through cities.

    With software that she and her team developed, Schewel can type in an address and find the demographics of the people who drive by or stop near that location. The system shows when they drive by, how frequently, and even what neighborhoods they’re coming from. (Importantly, Schewel’s algorithms analyze the movements of groups of these devices, rather than individual units. That means StreetLight’s analytics can’t be reverse-engineered to reveal any given individual’s movements.) 
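
    The group-level aggregation she describes can be pictured with a short Python sketch. StreetLight’s actual methods are its own, so the data model, the distance cutoff, and the minimum group size here are all assumptions made for illustration:

```python
import math
from dataclasses import dataclass

# Illustrative sketch only (StreetLight's real algorithms are proprietary):
# count distinct devices seen near a location, bucketed by home area and hour,
# and suppress any bucket too small to be safely reported.
@dataclass
class Ping:
    device_id: str
    lat: float
    lon: float
    hour: int        # hour of day, 0-23
    home_zip: str    # coarse home area, inferred elsewhere

def near(p, lat, lon, radius_km=0.5):
    # crude flat-earth distance; adequate at city scale for a sketch
    dx = (p.lon - lon) * 111.0 * math.cos(math.radians(lat))
    dy = (p.lat - lat) * 111.0
    return math.hypot(dx, dy) <= radius_km

def traffic_profile(pings, lat, lon, min_group=5):
    buckets = {}
    for p in pings:
        if near(p, lat, lon):
            buckets.setdefault((p.home_zip, p.hour), set()).add(p.device_id)
    # report only groups, never individuals
    return {key: len(devices) for key, devices in buckets.items()
            if len(devices) >= min_group}
```

    Suppressing the small buckets is the design choice that keeps the output at the level of groups rather than people.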

    The information is appealing to customers far beyond the transportation-policy world. A medical office, an auto repair shop, and a small restaurant chain have been using StreetLight’s software to help them decide where to open new locations and place billboards. And the nonprofit Oakland Business Development Corporation is using the software to demonstrate that people with disposable income often spend time in Oakland even if they don’t live nearby. The data, the group hopes, will encourage small businesses and national chains to consider opening up shop in the city’s struggling downtown, which has 400 vacant storefronts and office buildings in one square mile.

    Schewel still believes she can make transportation more efficient. But rather than trying to persuade people to be green, she is focused on helping businesses—which have become “the most powerful behavioral-change force in America”—make it easy for people to do greener things. For example, if suburbanites can do some shopping near their offices in downtown Oakland on their commutes home, that might reduce the mileage they would otherwise have to drive. Naturally, Schewel backs up that idea with data: 30 percent of all miles driven in the U.S. are related to shopping.

    Jessica Leber

  • Age: 34

    Yu Zheng

    Analyzing newly available data about the intricacies of urban life could make cities better.

    Commuting through Beijing’s apocalyptic congestion and pollution can test anyone’s patience. But it has inspired big ideas from Yu Zheng, a lead researcher at Microsoft Research Asia.

    Take pollution. Most air-quality monitoring systems in China give a reading for an entire city. But air quality can vary greatly within cities depending on traffic, building density, and weather conditions. Zheng is taking that into account with a new project, U-Air. It analyzes current and past data from monitoring networks and many other sources to infer air quality at any given point in the city. Eventually Zheng expects the system to predict air quality one or even five hours in advance. That could help people figure out, say, when and where to go jogging—or when they should shut the window or put on a mask.
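
    A minimal stand-in for the inference step might look like the Python below. U-Air itself fuses traffic, weather, and other data with machine learning; this sketch only interpolates the official station readings by inverse-distance weighting, and the coordinates and readings are invented:

```python
import math

# Minimal stand-in for inferring air quality at an arbitrary point (U-Air's
# real model is far richer); the stations and readings below are invented.
STATIONS = [            # (lat, lon, current PM2.5 reading in ug/m3)
    (39.954, 116.468, 120.0),
    (39.929, 116.417, 85.0),
    (39.887, 116.407, 150.0),
]

def estimate_pm25(lat, lon):
    """Inverse-distance-weighted average of nearby station readings."""
    weight_sum = weighted_total = 0.0
    for s_lat, s_lon, reading in STATIONS:
        d = math.hypot(lat - s_lat, lon - s_lon) + 1e-6   # avoid division by zero
        w = 1.0 / (d * d)
        weight_sum += w
        weighted_total += w * reading
    return weighted_total / weight_sum

print(f"{estimate_pm25(39.915, 116.404):.0f} ug/m3")
```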

    In an earlier project, Zheng and his team showed that online mapping services could recommend much better driving directions by taking gridlock into account rather than just finding the shortest routes. The trick was to learn from Beijing taxi drivers, who are forced to find the smartest routes every day. Zheng’s group analyzed GPS data from 33,000 Beijing cabbies and figured out how to teach their subtle methods to a mapping program.
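
    The core of that idea fits in a toy Python example. It is not Zheng’s system: edge weights are simply travel times observed at a given hour (the tiny road graph and timings are invented), and an ordinary shortest-path search then picks whichever route is actually fastest at that time of day:

```python
import heapq

# Toy sketch (not Zheng's system): road-segment weights are travel times
# observed at a given hour, so shortest-path search reflects congestion.
# The graph and the timings are invented for illustration.
TRAVEL_TIME = {                      # minutes observed in taxi traces
    ("A", "B"): {8: 20.0, 14: 6.0},  # direct road: jammed at 8 a.m., fast at 2 p.m.
    ("A", "C"): {8: 9.0, 14: 9.0},
    ("C", "B"): {8: 7.0, 14: 8.0},
}

def neighbors(node, hour):
    for (u, v), by_hour in TRAVEL_TIME.items():
        if u == node and hour in by_hour:
            yield v, by_hour[hour]

def fastest_route(start, goal, hour):
    """Dijkstra over time-of-day weights; returns (minutes, path)."""
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, minutes in neighbors(node, hour):
            if nxt not in visited:
                heapq.heappush(frontier, (cost + minutes, nxt, path + [nxt]))
    return float("inf"), []

print(fastest_route("A", "B", hour=8))    # rush hour: the detour via C wins
print(fastest_route("A", "B", hour=14))   # off-peak: the direct road wins
```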

    “When I see a problem,” he says, “I feel passionate about trying to solve it.”

    Michael Standaert