Cholera kills, and vaccines don’t always work. She created a better solution.
Cholera affects millions of people annually in the world’s poorest communities. It’s often treated with antibiotics, but they’re not ideal because they harm the beneficial bacteria in the gut, and antibiotic resistance is on the rise.
Minmin Yen developed a better solution: bacteriophages, or viruses that specifically target bacteria. What’s significant about Yen’s intervention is that it works immediately to kill the bacteria and prevent the disease from developing. Existing vaccines, in contrast, can take weeks to work.
Yen, who earned a PhD in molecular microbiology at Tufts University, says bacteriophages have gone largely unexplored because antibiotics are so prevalent, but she thinks it’s time for them to play a larger role now that resistant bacteria are so common. She has started a company, PhagePro, to bring her intervention to market.
Her tech nonprofit makes it easy for women to build a domestic-abuse case without a lawyer.
Hera Hussain is empowering women around the world via a simple combination of social and technological innovation: enlisting volunteers to crowdsource multilingual online guides covering topics like how to build a domestic-abuse case without a lawyer, or how to identify psychological manipulation.
It all started after Hussain tried to help two friends escape abusive marriages. “You would think in the UK it would be easy to find information about how to get a divorce, how to apply for asylum, what are the laws to apply for child custody,” she says. “But it was frighteningly difficult to get that information.”
In 2013 Hussain founded the open-source, nonprofit organization Chayn in her spare time—to make the missing information easy to find and understand. Today, 70 percent of Chayn’s 400 volunteers are survivors of violence and oppression themselves. Their guides are built largely from crowdsourced research and firsthand experience of the overlapping psychological, cultural, and legal complexities involved in oppression against women.
Hussain says she’s lost count of the times she’s been lectured—mostly by men—that Chayn’s guides shouldn’t be written by people without legal or academic backgrounds. “You get talked down to a lot,” she says. She likes to counter their arguments with the example of a woman from India who had spent years trying to figure out how to leave her abusive marriage. All the resources the woman found online were written by Indian lawyers, almost all men, lamenting that women took advantage of divorce laws rather than being dutiful wives.
Hussain continues to push Chayn to harness appropriate technologies to deepen its reach. A new chatbot, for instance, guides visitors to the most relevant information in as few clicks as possible.
Working to alleviate human suffering through AI.
Mustafa Suleyman cofounded the AI company DeepMind out of a desire to have as broad an impact on society as possible. AI, he decided, was the fastest way to do it.
Now Suleyman has launched DeepMind Health to build AI that can better diagnose disease, including systems that detect early-stage eye disease and help analyze mammograms. He’s also focusing on how such technology is used by medical clinicians. “The tech community is only just finally catching up in thinking about the ethical impact of these systems,” says Suleyman. For instance, will time-pressed clinicians simply defer to the AI’s top suggestions without critical evaluation? How will such systems be audited? And how can new medical findings take into account implicit biases in old data used to train the AI? “I think this is going to be the year when Silicon Valley and the technology companies come to really accept the incredible social responsibility that such great power carries,” he says.
Last year, Suleyman launched the DeepMind Ethics & Society unit to design systems that anticipate and direct algorithms’ decision-making processes and their impact on society.
“The big pivot that technology companies are going to make,” he says, “is to ask the question: How do we shape these algorithms so they represent the moral choices that we collectively elect to make?”