
Better Architecture

Computers are overdue for the fundamental changes they could soon get.
April 21, 2015

Computer architectures aren’t laws of physics. They’re man-made inventions designed to harness raw resources, such as billions of transistors, for a range of useful computational tasks. 


When our computing needs and tasks change—as they inevitably will over the decades—it becomes increasingly awkward to express programs through the original architecture. And yet that’s where we find ourselves—adhering to an ossified architecture that imposes constraints and slows our technological progress.

Today’s architectures are more than half a century old. In the 1940s, electronic computers became reprogrammable, with data and instructions (a.k.a. software) stored in memory and passed to a central processing unit (CPU) for computation. This architecture evolved slightly over time but remained fundamentally the same.

The vast majority of computing devices today are connected to the Internet, making them vulnerable to remote attack. Our data centers demand the type of strong security—including isolation and tracking of data—that classic architectures were never designed to support.

That’s one reason computing architectures must evolve. A system being developed by Hewlett-Packard, known as the Machine (see “Machine Dreams”), uses electronic components called memristors to store and process information—offering more powerful ways to handle large amounts of data—together with silicon photonic components that allow data to be transported at very high speeds using light. HP’s researchers are also developing a new operating system, Machine OS, to make the most of this new architecture.

Reinvention like this doesn’t solve all our problems. In some cases it creates new ones, chief among them backward compatibility. The consistent architecture of IBM’s System/360 in the 1960s and 1970s ensured that buyers of early models could upgrade their machines and feel confident that the programs they were already using would continue to work. Can a new architecture evolve without forcing every program to evolve with it?

Probably. Since the days of the System/360, compilers and program translators—tools that allow software to run on different architectures—have matured substantially. We’ll need to make the most of such tools if we hope to loosen our ties to legacy architectures and allow computers like the Machine to thrive.

Martha Kim is an associate professor of computer science at Columbia University.
