2021

Inventors

Their innovations point the way toward light-based chips, better gene editing, and skin-like electronics.

  • Xiao Sun
    Age: 34
    Affiliation: IBM

    He designs imprecise—but energy-efficient—AI hardware and software.

    Artificial-intelligence systems often require a vast amount of computation. That’s why in recent years, AI hardware researchers have been striving for lower precision: arithmetic that keeps track of just enough digits to produce a correct answer while avoiding calculations that track many more.

    Deep learning relies on networks that might have dozens of layers, and millions, or even billions, of parameters that must be adjusted to the correct values, a process called training the network. This often takes days or weeks of computations using hundreds of specialized chips.

    Xiao Sun is part of a research group at IBM that has been finding ways to perform those computations using three-digit, or even just two-digit, numbers (in contrast, a modern laptop or cell phone uses 20 digits to make calculations, while most dedicated machine-learning chips use five). 

    The real trick is in finding techniques that allow for small numbers to be used throughout the computation. You might still have to do many trillions of computations, but each one will be far simpler. This saves both time and energy—using two-digit numbers is more than 20 times more energy efficient than doing the same calculations using numbers in the billions, according to a paper by Sun and colleagues at IBM.
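
    One way to see the idea in miniature is with “fake quantization”: round the inputs of a matrix multiply onto a handful of levels, as a low-precision chip effectively does, and check how little the answer changes. The NumPy sketch below is only an illustration under assumed settings (a 4-bit quantizer and random data), not IBM’s actual training scheme or number formats.

        # Illustrative sketch only (not IBM's method): round values onto a small
        # number of levels to mimic low-precision arithmetic, then compare a
        # matrix multiply against the full-precision answer.
        import numpy as np

        def quantize(x, bits=4):
            """Round x onto a coarse grid of roughly 2**bits evenly spaced levels."""
            scale = np.max(np.abs(x)) / (2 ** (bits - 1) - 1)
            return np.round(x / scale) * scale

        rng = np.random.default_rng(0)
        weights = rng.normal(size=(256, 256)).astype(np.float32)
        activations = rng.normal(size=256).astype(np.float32)

        full = weights @ activations                      # full precision
        low = quantize(weights) @ quantize(activations)   # low precision
        print(np.linalg.norm(full - low) / np.linalg.norm(full))  # small relative error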

    In February, IBM announced a new chip, based in part on Sun’s work, that trains neural networks using computations involving mostly three-digit numbers. The company hopes to use it not only to train large neural networks in cloud computing centers but also in mobile phones that could train on local data.

  • Shelley Ackerman
    Age: 29
    Affiliation: Bolt Biotherapeutics

    She co-invented a novel immunotherapy for difficult-to-treat cancers.

    Using the body’s own immune system to fight cancer has shown promise against several types of tumors, but it’s not always effective. “There’s a whole subset of patients that it really doesn’t work well in,” says Shelley Ackerman.

    Tumors have to be “hot,” or inflamed, for immunotherapy drugs to work well. Hot tumors are characterized by the presence of a type of immune cell called T cells. Immunotherapy drugs give those T cells a boost, making them better cancer fighters. But many tumors are “cold” and thus evade the immune system. Without any T cells to work with, immunotherapy drugs fail against these tumors. 

    As a graduate student at Stanford, Ackerman worked with Edgar Engleman, a professor of medicine and pathology, to develop a therapy aimed at turning cold tumors into hot ones. The approach uses a tumor-targeting antibody chemically attached to an immune-stimulating small-molecule drug that prompts the immune system to recognize and attack the tumor, transforming it into a hot one invaded by tumor-killing T cells. Engleman founded a biotech company, Bolt Biotherapeutics, in 2015 to commercialize the approach; Ackerman joined Bolt in 2018. 

    As a child, Ackerman lost her uncle and a close friend to metastatic cancer within a year, and that experience made her want to keep working on the therapy in hopes that it would one day be used to treat patients. 

    Last year, Bolt began testing its approach in patients with breast, gastric, and other tumors that express a protein known as HER2. The company, which has raised $438 million in funding, is also developing drugs for colorectal, lung, and pancreatic cancers.

  • Ryan Babbush
    Age: 32
    Affiliation: Google

    Efficient quantum simulation algorithms might help find novel, powerful materials.

    Molecules are complicated. Forget the grade-school picture of electrons orbiting a nucleus like planets around the sun. Electrons can be shared among many atomic nuclei. They interact with one another in ways described by the equations of quantum mechanics. It’s these complex interactions, which grow exponentially with the number of electrons, that largely govern chemical reactions and the properties of molecules.

    Simulating these electrons with perfect precision might take a conventional computer millions of years. But algorithms running on quantum computers might be able to perform precise computations in days or even hours. This would provide clues on how to precisely design molecules with desired properties and tailor their reactions with amazing control. 
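
    The exponential growth is easy to make concrete: merely storing the full quantum state of n electron orbitals on a classical machine takes on the order of 2^n complex numbers, so each added orbital roughly doubles the cost. A back-of-envelope sketch (the orbital counts and 16 bytes per complex amplitude are illustrative assumptions):

        # Back-of-envelope sketch: memory needed to hold a full quantum state
        # vector for n two-level orbitals on a classical computer, assuming
        # 16 bytes per complex amplitude.
        for n in (20, 40, 60):
            amplitudes = 2 ** n
            gigabytes = amplitudes * 16 / 1e9
            print(f"{n} orbitals: {amplitudes:.2e} amplitudes, ~{gigabytes:.2e} GB")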

    Sufficiently precise quantum simulation might allow chemists to create new compounds like better high-temperature superconductors, catalysts that could take nitrogen or carbon dioxide out of the air, new drugs, more efficient solar cells, strong lightweight materials for airplanes, and so forth. It would be a way to quickly figure out how a new substance would behave without actually having to synthesize it. It might herald a new age of materials science.

    Between 2014 and 2020, Ryan Babbush published dozens of papers—together with collaborators at Google and elsewhere—that outlined dramatically more efficient quantum simulation algorithms. The upshot is that some quantum simulation calculations could, in principle, be done in hours, on a sufficiently powerful quantum computer.

    Take the case of nitrogenase, an enzyme that some bacteria use to remove nitrogen from the air and create ammonia, a compound of nitrogen and hydrogen. This process, known as nitrogen fixation, is essential for agriculture, which is why nitrogen-based fertilizers are a linchpin of the world’s food system. Nitrogenase is a big molecule that includes a catalytic site known as FeMoco.

    Currently, an energy-intensive technique known as the Haber-Bosch process produces most fertilizers, accounting for about 2% of humanity’s total energy usage. “If we could figure out how that enzyme [nitrogenase] is doing this, then we might be able to design an industrially viable alternative for producing fertilizer, which could scale and save a huge amount of energy,” Babbush says. 

    He and his collaborators have found a potential way to use a quantum computer to analyze FeMoco and shed light on the mechanism by which it first breaks the bonds between nitrogen atoms that are bound together in nitrogen gas and then succeeds in combining the nitrogen with hydrogen. (Babbush acknowledges that competing approaches using clever approximations to simulate molecules on classical computers might get there first.)

    Another line of research that Babbush has advanced aims to figure out how quantum computers can calculate the behavior of electrons in metals and crystals. Potential applications could include finding better superconductors or making more efficient solar cells. In these materials, the repeating pattern of the atoms creates very complicated behavior among the interdependent electrons. And Babbush is figuring out how quantum computers can be used to make sense of these interactions.

    If quantum computers succeed in remaking our material world, Babbush’s work will be one reason why.

  • Amay Bandodkar
    Age: 33
    Affiliation: North Carolina State University

    His lightweight sensors could make wearable tech more useful and practical.

    Wearable technology can provide real-time information about a person’s health and fitness, but creating sensors that can collect data without a cumbersome and impractical power system has proved difficult. Amay Bandodkar thinks he’s hit on a new way of creating “self-powered” biochemical sensors through unconventional technologies, making wearable tech lighter and less cumbersome. The resulting device is about four times smaller and 20 times lighter than similar devices produced two years ago, he says.

    The key to shrinking the sensor was overhauling how it’s powered. “All the groups that were working on this were using these really bulky batteries, and the sensor was around 3% of the total size and weight,” he says. So he built a sensor that doesn’t require a battery: it harnesses the catalytic properties of enzymes to generate signals without the need for power sources. While this concept can be used to develop self-powered sensors for some chemicals, for other kinds of sensors that still need a power source, Bandodkar has developed a lightweight battery that runs on sweat. It’s made of a magnesium anode and a cathode made of silver and silver chloride, separated by a dry cellulose membrane. When a person wearing it starts to perspire, the cellulose membrane absorbs the sweat and acts as an electrolyte, effectively turning the battery on and powering the sensor.
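
    As a rough plausibility check on that electrode pairing, the theoretical open-circuit voltage of a magnesium/silver-chloride cell can be estimated from textbook standard reduction potentials. This is only a back-of-envelope sketch; a real sweat-activated cell delivers less because of internal resistance and other losses.

        # Textbook sketch: theoretical voltage of a Mg / AgCl cell from standard
        # reduction potentials (volts vs. the standard hydrogen electrode).
        E_cathode = 0.22    # AgCl + e-  ->  Ag + Cl-
        E_anode = -2.37     # Mg2+ + 2e- ->  Mg
        print(f"theoretical cell voltage: {E_cathode - E_anode:.2f} V")  # about 2.6 V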

    Bandodkar has successfully tested out a heart-rate sensor running on his battery, opening the gateway for heart-monitoring wearables.

  • Jonathan Gootenberg
    Age: 30
    Affiliation: MIT

    He is expanding the capabilities of gene editing.

    The gene-editing tool CRISPR uses a protein called Cas9 to snip out a targeted part of the genome. It does amazing work but has downsides. It can cause unintended edits to other places in the genome, and if you want to make only a temporary tweak, Cas9 can’t do it. 
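
    To make “targeted” concrete: Cas9 is steered by a guide RNA of roughly 20 letters, and it can only cut where a matching sequence sits next to a short motif (for the common SpCas9, “NGG”). The sketch below scans a toy DNA string for such candidate sites; the sequence is made up and this is not any lab’s actual pipeline.

        # Minimal sketch: list 20-letter protospacers immediately followed by an
        # "NGG" motif (the SpCas9 PAM) in a made-up DNA sequence.
        import re

        dna = "ATGCGTACCGGTTAGCTAGCTAGGCTTACGGATCCGTACGTAGCTAGGAGGCTA"

        for match in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", dna):
            print(match.start(), match.group(1))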

    Jonathan Gootenberg is creating other editing tools to get around Cas9’s shortcomings and add to the capabilities of CRISPR. Gootenberg has used Cas12, a more compact protein than Cas9, to edit many genes at a time. This capability could be used to edit a patient’s immune cells so that they fight cancer.

    Then there’s Cas13. Gootenberg and his colleague Omar Abudayyeh (a 2020 Innovator Under 35) demonstrated that the protein could target RNA instead of DNA, an intriguing finding. Many viruses have RNA as their genetic material, and bacteria have both DNA and RNA, so the researchers reasoned that you could use Cas13 to find genetic material from pathogens in human cells. That led to repurposing the gene-editing tool as a paper-based diagnostic test, and in 2019 Gootenberg and Abudayyeh cofounded Sherlock Biosciences to commercialize the technology.

  • Nicholas Harris
    Age: 33
    Affiliation: Lightmatter

    Shining light through optical chips might be the fastest way for neural networks to make decisions.

    For decades physicists and engineers have dreamed of making optical chips that use photons, not electrons, to do computing. Such circuits could be lightning fast and energy efficient. But making them work has been difficult.

    In 2017, Nicholas Harris, together with Yichen Shen and other colleagues at MIT, published a widely cited paper describing a design that allowed them to calculate the outputs of neural networks that had been conventionally trained.

    The paper describes a circuit of 56 programmable interferometers, devices that carefully break apart and recombine light waves. The circuits they created tackled a simplified vowel-recognition problem, correctly distinguishing about three quarters of the 180 cases they tried. This wasn’t as good as a conventional computer, which got over 90% of them right. Shortly thereafter, Shen and Harris launched competing startups.

    Once a given neural network has been trained and implemented on an optical chip, performing inferences (figuring out which vowel corresponds to which sound, or how an autonomous car should react if a pedestrian steps into the street) can be almost as simple as shining light through it. This has the advantage of being both fast and energy efficient.
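
    A crude way to see why “shining light through it” is computation: each programmable interferometer applies a small matrix to the light passing through it, and meshes of them compose into the larger matrices a neural-network layer needs. The NumPy sketch below models one generic 2x2 Mach-Zehnder-style block; the phase values are arbitrary and this is not Lightmatter’s actual design.

        # Sketch of one tunable 2x2 interferometer block: two 50/50 beam splitters
        # and two phase shifters give a programmable unitary matrix, so sending
        # light through the block performs a small matrix-vector product.
        import numpy as np

        def mzi(theta, phi):
            bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)     # 50/50 beam splitter
            ps = lambda p: np.diag([np.exp(1j * p), 1.0])      # phase shifter
            return bs @ ps(theta) @ bs @ ps(phi)

        U = mzi(0.7, 1.3)                # "program" the block with two phases
        x = np.array([1.0, 0.5])         # input light amplitudes (toy values)
        print(U @ x)                     # output amplitudes after one pass
        print(np.allclose(U.conj().T @ U, np.eye(2)))   # unitary, so lossless: True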

    In March 2021, Lightmatter announced it would soon start selling a “machine-learning accelerator” chip. “It’s just a completely different kind of computer,” says Harris. “Right now we’re at about a factor of 20 times more efficient than the most advanced node in digital computers.” Lightmatter closed a second round of funding in May, bringing its total investment to $113 million.

  • Yichen Shen
    Age: 32
    Affiliation: Lightelligence

    Optical chips that can make calculations for neural networks are poised to become big business.

    There are two basic types of computations involving neural networks. First, the networks must be trained, which usually involves showing them lots of data, causing them to adjust the strength of the connections between their numerous “neurons.” Next, those existing connections are used to make decisions. It’s the difference between learning to drive and driving.

    The difference is crucial. If a neural network takes weeks to learn how to recognize images, that’s not necessarily a problem. But if it is driving an autonomous car, it needs to be able to make life-or-death inferences in fractions of a second.
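
    A toy example of that split, using a single-layer classifier as a stand-in for a real network (the data here is random): training loops over the data many times to adjust the weights, while inference with the finished model is a single short chain of multiply-and-add operations, which is exactly the kind of work that can be mapped onto optics.

        # Toy illustration of training vs. inference with a one-layer model.
        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 4))                         # made-up inputs
        y = (X @ np.array([1.0, -2.0, 0.5, 0.0]) > 0) * 1.0   # made-up labels
        w = np.zeros(4)

        for _ in range(500):                                  # training: many passes
            pred = 1 / (1 + np.exp(-(X @ w)))
            w -= 0.1 * X.T @ (pred - y) / len(y)              # gradient update

        x_new = rng.normal(size=4)
        print(1 / (1 + np.exp(-(x_new @ w))))                 # inference: one pass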

    That’s where optical computers come in. Despite decades of research, they’ve never worked that well. It’s harder to manipulate photons than electrons. But for certain types of computations—like those commonly needed when using an existing neural network to make inferences—photons are just the thing.

    In 2017, Yichen Shen and Nicholas Harris published a widely cited paper on the use of optical circuits for machine-learning tasks including speech and image recognition. Their design, one review article notes, “represents a truly parallel implementation of one of the most crucial building blocks of neural networks using light, and modern foundries could easily mass-fabricate this type of photonic system.” This means that optical computers on a chip could become a huge business, with one in every device that uses a neural network to make decisions.

    Shen and Harris now run competing startups. Shen’s firm, Lightelligence, released a prototype optical AI chip in 2019, and Shen says it has secured over $100 million in funding.

  • Virginia Smith
    Age: 31
    Affiliation: Carnegie Mellon University

    Her AI techniques are efficient and accurate while preserving fairness and privacy.

    When Virginia Smith began her PhD in artificial intelligence, she had a question: How do you train a neural network on data that is stored across multiple machines? 

    Her attempts to answer it have made her a leader in the field of federated learning, which seeks to handle data spread across hundreds, or even millions, of remote sources. 

    Google researchers first introduced federated learning in 2017 for use with the company’s mobile devices. The method they devised involved training millions of neural networks locally before sending them to a company server to be merged into a master model. It allowed the master model to train on data from every device without making it necessary to centralize that data. This not only reduced latency in the mobile experience but could also improve each user’s data privacy.
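
    Stripped to its core, that merging step (often called federated averaging) is a weighted average of locally trained parameters. The sketch below uses a toy linear model and made-up data on five simulated devices; it is meant only to show the pattern of local training plus server-side averaging, not Google’s production system.

        # Sketch of federated averaging: the server only ever sees locally
        # trained parameters, which it merges into a single master model.
        import numpy as np

        def local_update(w_start, x, y, lr=0.1, steps=20):
            """A few gradient steps of linear regression on one device's data."""
            w = w_start.copy()
            for _ in range(steps):
                w -= lr * x.T @ (x @ w - y) / len(y)
            return w

        rng = np.random.default_rng(0)
        devices = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(5)]
        global_w = np.zeros(3)

        for _ in range(10):                                   # communication rounds
            local_models = [local_update(global_w, x, y) for x, y in devices]
            sizes = [len(y) for _, y in devices]
            global_w = np.average(local_models, axis=0, weights=sizes)

        print(global_w)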

    But combining millions of AI models also risks creating a central model that performs well on average but poorly for outliers—for example, voice recognition software that fails when the speaker has an unfamiliar accent.

    So Smith proposed a new technique for more “personalized” federated learning. Rather than merge a million localized models into one, it merges the most similar localized models into a few—the more heterogeneous the data, the greater the number of final models. Each model still learns from many devices but is also tailored to specific subsets of the user population.
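
    One loose way to picture that idea (a sketch, not Smith’s actual algorithm): treat each device’s locally trained parameters as a point, cluster the points by similarity, and average within each cluster, so heterogeneous users end up with a few tailored models rather than one global one.

        # Loose sketch: group similar local models with a tiny 2-means clustering,
        # then average within each group to get a few personalized models.
        import numpy as np

        rng = np.random.default_rng(2)
        local_models = np.vstack([rng.normal(loc=1.0, size=(6, 4)),    # one user population
                                  rng.normal(loc=-1.0, size=(6, 4))])  # a dissimilar one

        centers = local_models[[0, -1]].copy()     # seed with two dissimilar models
        for _ in range(20):
            dists = np.linalg.norm(local_models[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            centers = np.array([local_models[labels == j].mean(axis=0) for j in range(2)])

        print(labels)    # which merged model each device is assigned
        print(centers)   # one model per cluster instead of a single global model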

    Smith also works to overcome other challenges in federated learning, such as accounting for different power and memory constraints on different devices. To encourage more research, she co-created an open-source tool that lets researchers test their federated techniques on more realistic data sets and in more realistic environments.

  • Jie Xu
    Age: 33
    Affiliation: Argonne National Laboratory

    She makes durable, easy-to-manufacture polymer semiconductors for skin-like electronics.

    Jie Xu has made printable, stretchable electronics viable for mass production. Her multiple breakthroughs could be used in future wearable technology, advanced robotics, and human-computer interfaces with sensors connected to the skin.

    The key for Xu was inventing polymer circuits that kept working despite being flexed, stretched, and repeatedly moved. That had been a challenge for researchers until 2016, when Xu engineered a two-polymer coating applied to a rubbery surface that could be stretched to twice its size and still conduct electricity. 

    In 2019, she refined the technology so that her stretchable semiconductors could be mass-produced using roll-to-roll manufacturing, a common industrial fabrication process used to print anything from textiles to plastics on large rollers. It was the first time anyone had achieved such a feat at scale. 

    In the short term, Xu’s materials and manufacturing inventions can make flexible displays and skin-worn medical sensors far more practical and easier to manufacture. Samsung Electronics has already patented two methods Xu helped define during a collaboration with the company. Xu’s materials could also aid in the design of prosthetics with functional skin-like outer coverings.

    Wary of adding yet more plastics into the world, Xu is searching for versions of the polymer semiconducting materials that are recyclable or biodegradable. “I think that kind of idea should be integrated from the very beginning of any commercial material,” she says.