In less than a century, computing has transformed our society and helped spur countless innovations. We now carry in our back pockets computers that we could only have dreamed of a few decades ago. Machine-learning systems can analyze scenes and drive vehicles. And we can craft extraordinarily accurate representations of the real world—models that can be used to design nuclear reactors, simulate myriad greenhouse-gas emission scenarios, and launch a probe on a nine-year trip to study Pluto in an all-too-brief high-speed fly-by.
We fundamentally owe these capabilities to our ability to build progressively better computing devices—the transistors and other components at the heart of computer chips. But the transistor is reaching its limits, along with the traditional von Neumann architecture—the system of separate logic and memory that we use to construct computers. If we want to keep improving computer performance and energy efficiency, it’s time for some fresh ideas.
There are, of course, plenty of possibilities at hand: quantum computers, optoelectronic components made from two-dimensional materials, and analog circuitry are just a few. Many of these approaches have been discussed for years, if not decades.
But some are now reaching promising levels of maturity. In my research and that of 35 Innovators awardee Xu Zhang at Carnegie Mellon University, for example, 2D semiconductors are making their way into optoelectronic devices—the sort used in telecommunications. These devices have started to surpass the performance of conventional switches made with silicon and III-V semiconductors (compounds with elements from columns III and V of the periodic table).
This essay is part of MIT Technology Review’s 2022 Innovators Under 35 package recognizing the most promising young people working in technology today.
Optical computing, an early approach that was later abandoned in favor of binary electronic circuitry, is also moving forward. I am fascinated by the possibility of building computers that use light as the “working fluid,” passing photons around much the way our present chips do electrons.
This is already happening: silicon photonic chips offer high energy efficiency and are helping to overcome performance bottlenecks in traditional GPU architectures. They can reduce the time needed to train deep-learning models, enabling the next generation of advanced AI. There are also opportunities to integrate photonics with new low-power chip designs like those from TR35 awardee Hongjie Liu at Reexen Technology.
In the long term, such photonic circuits could help us approach or perhaps even surpass widely accepted limits in computing. Theoretical work in photonic information processing suggests that light can be converted to heat and vice versa, which opens up some remarkable opportunities for all-optical energy storage—essentially batteries made out of photons—and alternative computing architectures.
Many of these projects are still happening primarily in the academic realm, but we are slowly moving toward building larger-scale, more fully integrated systems. If we keep thinking about how these ideas can be woven into complete computing systems, the coming years should see us move even further beyond traditional chips and toward an array of different forms of computing.
Starting in July, Prineha Narang will be the Howard Reiss Chair Professor in Physical Sciences at the University of California, Los Angeles. She was a 35 Innovators honoree in 2018.