The cofounder of Snapchat figured out that people wanted something different from social media.
At the center of Snapchat—the disappearing-photo social network valued at $20 billion, used by 150 million people—sits an exotic-car-driving, engaged-to-a-supermodel 26-year-old genius. Or jerk. Or both—it’s hard to tell. Evan Spiegel is kind of a recluse. The guy behind this new media empire follows only about 50 people on the mobile app he helped create. (One of them is the magician David Blaine.) He declined to speak to me, which is fitting, because what Snapchat is, what Spiegel understands better than anyone, might be the opposite of an interview with a magazine.
Snapchat is often compared to Facebook, and Spiegel to Mark Zuckerberg. Which makes sense, especially since Facebook tried to buy Snapchat for $3 billion before releasing its own knockoff versions that promptly fell into irrelevance. And both founders are college dropouts (Spiegel from Stanford, Zuckerberg from Harvard). But Facebook is a company built on making your personal data public and delivering targeted ads; the whole point of Snapchat is to delete your images or videos after you send them to your friends. Snapchat, Spiegel has said, is based on the idea that “ephemeral should be the default.”
In its five years of existence—an epoch in startup time—the company has outlasted rivals like Poke and Ansa and Gryphn and Vidburn and Clipchat and Efemr (I swear I’m not making these up) and Wink and Blink and Frankly and (I promise you) Burn Note and Glimpse and Wickr. It reaches 41 percent of U.S. 18- to 34-year-olds every day and generates revenue from media companies and advertisers that publish snaps in dedicated channels. What did Snapchat do right that others didn’t? One thing you immediately notice upon downloading the app is how much it requires of you. You can’t just sit back and watch—you, too, must snap. The home screen practically begs you to take a picture or shoot a video. Photography was once all about capturing a moment forever; Spiegel’s great insight was that now the best way to make people pay attention is to capture that moment, share it, and watch as it disappears.
To build better machines, a roboticist goes far outside her field for guidance.
Nora Ayanian calls robots people. It’s not some weird affectation; it helps her with her work.
She’s a computer scientist who thinks machines should work together to get things done. Let’s say a farmer wants to have drones autonomously survey crops and take soil samples. You couldn’t program each drone with the same set of commands, because each would have a different task and would have to solve different problems as it navigated. You know what is good at solving problems on the fly, in a group that draws on various skills from different individuals? People.
So Ayanian studies robot coördination by studying people. One way is by having groups of humans play a simple video game that limits their senses and stifles communication. They need to figure out how to do “something meaningful” together, as she puts it, such as arranging their on-screen figures into a circle. Ayanian watches how people coöperate on such tasks with as little information as possible.
Why not just create a dictator robot—one machine that sees the whole field and directs other drones? Well, Ayanian counters, what happens when the dictator robot runs out of power? Or crashes? Distributed and diverse teams, she says, are always better at problem-solving, once they learn to work together.
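The advantage Ayanian describes can be seen in a minimal simulation (all names and positions here are illustrative, not her actual research code): each simulated drone sees only its two neighbors, yet the group still converges on a meeting point, with no dictator robot ever computing the answer.

```python
# Decentralized rendezvous: each agent sees only its two ring
# neighbors and repeatedly moves toward their average position.
# No agent (and no central controller) ever sees the whole field,
# yet the team converges on a common meeting point.

def step(positions):
    n = len(positions)
    new = []
    for i, (x, y) in enumerate(positions):
        lx, ly = positions[(i - 1) % n]   # left neighbor only
        rx, ry = positions[(i + 1) % n]   # right neighbor only
        # Move halfway toward the midpoint of the two neighbors.
        new.append((0.5 * x + 0.25 * (lx + rx),
                    0.5 * y + 0.25 * (ly + ry)))
    return new

def rendezvous(positions, rounds=200):
    for _ in range(rounds):
        positions = step(positions)
    return positions

agents = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0), (2.0, 6.0)]
final = rendezvous(agents)
# All agents end up (nearly) at the same point -- the centroid of the
# starting positions -- even though none of them ever computed it.
```

If one agent fails, the rest simply keep averaging with whoever remains; a centralized version loses everything when its one all-seeing node dies.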
The creator of control software for drones has foreseen the advantages of autonomous aircraft for years.
As an engineering and computer science student at MIT, Jonathan Downey starts a group that builds drones and competes against other colleges.
While working for Boeing, he develops flight-control software for an autonomous helicopter funded by the Pentagon.
He founds a startup called Airware out of frustration with what he calls “inflexible and costly” autopilot systems for unmanned aircraft that made it hard to add new capabilities. He also spends five months flying tourists in a turboprop plane between Las Vegas and the Grand Canyon.
Airware ships its first control software to drone manufacturers.
General Electric invests in Airware, saying drones could help make it safer and cheaper to maintain industrial equipment such as power lines.
Airware launches several products intended to help big companies use drones. For instance, software designed by former game developers lets companies take aerial photos of sprawling facilities as easily as you would click on a map. State Farm uses Airware’s technology to inspect roofs after weather damage.
U.S. regulators remove rules that had tightly limited what companies could do with drones, clearing a path for many more companies to use Airware’s services.
An industry group, the Association for Unmanned Vehicle Systems International, predicts commercial drones will have created $80 billion in business value and 100,000 jobs by this time. “We will not be able to imagine doing our jobs without them,” says Downey.
A scientist who is developing new gene-editing techniques also warns of their potential.
Kevin Esvelt works at MIT’s Media Lab to develop ways of influencing how ecosystems evolve.
The Back Story
Visited the Galápagos Islands at age 10. “I knew evolution would impact what I wanted to do.”
His Burning Issue
Gene drives, a new technology that could be used to quickly spread traits among wild creatures such as mosquitoes.
What’s at Stake
Wiping out mosquitoes, and maybe malaria. “Unimaginable amounts of suffering occur in the wild, and evolution doesn’t care,” he says.
Are gene drives safe enough to ever use in the open, or will they have dangerous unintended consequences?
No gene drive able to spread globally should be released, he argues. Or even tested. Scientists need to disclose their plans.
He’s designed safer gene drives that can be controlled.
Raising awareness about the potential threats of gene drives is “a home run for biosecurity,” says the FBI.
His Hobbies
Risky ones: unicycling and hang-gliding.
People on the autism spectrum are inspiring her novel approach to creating artificial intelligence.
“My research began in graduate school when I was working on artificial-intelligence systems and read Thinking in Pictures by Temple Grandin, a professor of animal science who talks about how her autism gives her this unique visual way of thinking compared to most people.
“I thought: That’s interesting. Most AI systems are not ‘visual thinkers’ like her. Most AI systems use variables, numbers, lists, and so on, and they reason using mathematical and logical operations. These systems are ‘verbal thinkers.’ What if you had an AI system that used data made up entirely of images and reasoned only using visual operations, like rotating images around or combining images together? If Temple Grandin can do amazing things because of her visual thinking abilities, it seemed to me that the same should be true of AI systems.
“I’ve been taking what we learn from people on the autism spectrum who have interesting visual abilities and building that into AI systems. It’s early, but I expect that they ultimately will be very valuable. If we want to help students learn to solve difficult problems, then we ought to have several AI tutors that can show students different ways of solving the same problem. If we want to help doctors find patterns of disease outbreaks, then we ought to have multiple AI analysts that can sift through the data using different styles of pattern finding.”
—as told to David Talbot
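The visual-operations approach described above can be sketched in a few lines. This toy example (the grids and the transform set are illustrative, not Kunda’s actual system) solves an A:B :: C:? analogy purely by trying image transformations, never converting the pictures into words or symbols:

```python
# A toy "visual thinker": it solves an A:B :: C:? analogy using
# only image operations (rotations and flips on 0/1 pixel grids).
# Illustrative sketch of the idea, not a real research system.

def rot90(img):
    # Rotate a square 0/1 grid 90 degrees clockwise.
    n = len(img)
    return [[img[n - 1 - c][r] for c in range(n)] for r in range(n)]

def hflip(img):
    return [row[::-1] for row in img]

TRANSFORMS = {
    "identity": lambda im: im,
    "rot90":    rot90,
    "rot180":   lambda im: rot90(rot90(im)),
    "rot270":   lambda im: rot90(rot90(rot90(im))),
    "hflip":    hflip,
    "vflip":    lambda im: im[::-1],
}

def solve_analogy(a, b, c):
    # Find a visual transform that turns A into B; apply it to C.
    for name, t in TRANSFORMS.items():
        if t(a) == b:
            return name, t(c)
    return None, None

A = [[1, 0],
     [0, 0]]
B = [[0, 1],
     [0, 0]]   # A rotated 90 degrees clockwise
C = [[1, 1],
     [1, 0]]
name, D = solve_analogy(A, B, C)
```

A “verbal thinker,” by contrast, would first describe each picture in symbols (“dot in top-left corner”) and reason over those descriptions; here the reasoning itself stays in image space.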
Why don’t computers keep our personal data secure by default?
When programmers create a feature for an app or a website, even something as simple as a calendar, they should code in protections so the personal information that the feature needs to access—such as your location—doesn’t slip out onto the Internet. Needless to say, they sometimes fail, leaving our data to be exploited by hackers. “Just like there are many ways to sink a boat,” says Jean Yang, “there are many ways to leak information.”
That’s why Yang created Jeeves, a programming language with privacy baked in. With Jeeves, developers don’t necessarily have to scrub personal information from their features, because Yang’s code essentially does it automatically. “It is a double hull for information leaks,” Yang says.
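The core idea, attach a privacy policy to a value once and let the runtime decide what each viewer sees, can be sketched in ordinary Python. This is an illustration of the concept only, not Jeeves’s real API, and the user names and location data are made up:

```python
# Concept sketch of policy-attached ("faceted") values, the idea
# behind Jeeves: a sensitive value carries its own privacy policy,
# and every read goes through that policy rather than through
# ad hoc checks scattered across the application.

class Sensitive:
    def __init__(self, secret, public, policy):
        self._secret = secret    # the real value
        self._public = public    # what unauthorized viewers see
        self._policy = policy    # viewer -> bool

    def reveal(self, viewer):
        # The policy, not the feature's code, decides what leaks out.
        return self._secret if self._policy(viewer) else self._public

# Hypothetical example: exact GPS coordinates are visible to
# friends only; everyone else sees just the city.
location = Sensitive(
    secret="40.4433 N, 79.9436 W",
    public="Pittsburgh",
    policy=lambda viewer: viewer in {"alice", "bob"},  # friend list
)

location.reveal("alice")     # a friend gets the exact coordinates
location.reveal("stranger")  # anyone else gets the coarse value
```

The “double hull” in Yang’s metaphor is that even a feature written without any privacy checks at all can only ever obtain the value through `reveal`, so a forgotten check degrades to the safe public facet instead of leaking the secret.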
She has uploaded the code to open-source libraries for anyone to use. And this fall she begins as an assistant professor of computer science at Carnegie Mellon, where she can try to get her ideas to spread further. “Giving people tools to create technology is incredibly empowering,” she says.