Better Architecture

Computers are overdue for the fundamental changes they could soon get.
April 21, 2015

Computer architectures aren’t laws of physics. They’re man-made inventions designed to harness raw resources, such as billions of transistors, for a range of useful computational tasks. 

Martha Kim

When our computing needs and tasks change—as they inevitably will over the decades—it becomes increasingly awkward to express programs through the original architecture. And yet that’s where we find ourselves—adhering to an ossified architecture that imposes constraints and slows our technological progress.

Today’s architectures are more than half a century old. In the 1940s, electronic computers became reprogrammable, with data and instructions (a.k.a. software) stored in memory and passed to a central processing unit (CPU) for computation. This architecture evolved slightly over time but remained fundamentally the same.

The vast majority of computing devices today are connected to the Internet, making them vulnerable to remote attack. Our data centers demand the type of strong security—including isolation and tracking of data—that classic architectures were never designed to support.

That’s one reason computing architectures must evolve. A system being developed by Hewlett-Packard, known as the Machine (see “Machine Dreams”), uses electronic components called memristors to store and process information—offering more powerful ways to handle large amounts of data—together with silicon photonic components that allow data to be transported at very high speeds using light. HP’s researchers are also developing a new operating system, Machine OS, to make the most of this new architecture.

Reinvention like this doesn’t solve all our problems. In some cases it creates new ones. The consistent architecture of IBM’s System/360 in the 1960s and 1970s ensured that buyers of early models could upgrade their machines and feel confident that the programs they were already using would continue to work. Can a new architecture evolve without forcing every program to evolve with it?

Probably. Since the days of the System/360, compilers and program translators—tools that allow software to run on different architectures—have matured substantially. We’ll need to make the most of such tools if we hope to loosen our ties to legacy architectures and allow computers like the Machine to thrive.

Martha Kim is an associate professor of computer science at Columbia University.
