Better Architecture
Computer architectures aren’t laws of physics. They’re man-made inventions designed to harness raw resources, such as billions of transistors, for a range of useful computational tasks.

When our computing needs and tasks change—as they inevitably will over the decades—it becomes increasingly awkward to express programs through the original architecture. And yet that’s where we find ourselves—adhering to an ossified architecture that imposes constraints and slows our technological progress.
Today’s architectures are more than half a century old. In the 1940s, electronic computers became reprogrammable, with data and instructions (a.k.a. software) stored in memory and passed to a central processing unit (CPU) for computation. This architecture evolved slightly over time but remained fundamentally the same.
The vast majority of computing devices today are connected to the Internet, making them vulnerable to remote attack. Our data centers demand the type of strong security—including isolation and tracking of data—that classic architectures were never designed to support.
That’s one reason computing architectures must evolve. A system being developed by Hewlett-Packard, known as the Machine (see “Machine Dreams”), uses electronic components called memristors to store and process information—offering more powerful ways to handle large amounts of data—together with silicon photonic components that allow data to be transported at very high speeds using light. HP’s researchers are also developing a new operating system, Machine OS, to make the most of this new architecture.
Reinvention like this doesn’t solve all our problems. In some cases it creates new ones. The consistent architecture of IBM’s System/360 in the 1960s and 1970s ensured that buyers of early models could upgrade their machines and feel confident that the programs they were already using would continue to work. Can a new architecture evolve without forcing every program to evolve with it?
Probably. Since the days of the System/360, compilers and program translators—tools that allow software to run on different architectures—have matured substantially. We’ll need to make the most of such tools if we hope to loosen our ties to legacy architectures and allow computers like the Machine to thrive.
Martha Kim is an associate professor of computer science at Columbia University.