
Better Architecture

Computers are overdue for the fundamental changes they could soon get.

Computer architectures aren’t laws of physics. They’re man-made inventions designed to harness raw resources, such as billions of transistors, for a range of useful computational tasks. 

Martha Kim

When our computing needs and tasks change—as they inevitably will over the decades—it becomes increasingly awkward to express programs through the original architecture. And yet that’s where we find ourselves—adhering to an ossified architecture that imposes constraints and slows our technological progress.


Today’s architectures are more than half a century old. In the 1940s, electronic computers became reprogrammable, with data and instructions (a.k.a. software) stored in memory and passed to a central processing unit (CPU) for computation. This architecture evolved slightly over time but remained fundamentally the same.
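To see that stored-program idea in miniature, here is a toy sketch in Python. The three-operation instruction set is invented for illustration and corresponds to no real hardware, but the loop it runs captures the design that has persisted since the 1940s: program and data sit in one memory, and a central processor fetches and executes instructions one after another.

```python
# A toy stored-program machine: instructions and data share one memory,
# and a loop fetches, decodes, and executes them one at a time.
# The three-operation instruction set is invented for illustration.

memory = [
    ("LOAD", 0, 100),   # put the constant 100 into register 0
    ("LOAD", 1, 23),    # put the constant 23 into register 1
    ("ADD",  0, 1),     # register 0 <- register 0 + register 1
    ("HALT", 0, 0),
]
registers = [0, 0]

pc = 0  # program counter: the address of the next instruction
while True:
    op, a, b = memory[pc]          # fetch and decode
    if op == "LOAD":
        registers[a] = b
    elif op == "ADD":
        registers[a] += registers[b]
    elif op == "HALT":
        break
    pc += 1                        # step to the next instruction

print(registers[0])  # 123
```

Real CPUs add caches, pipelines, and billions of transistors of cleverness, but the skeleton is the same.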

The vast majority of computing devices today are connected to the Internet, making them vulnerable to remote attack. Our data centers demand the type of strong security—including isolation and tracking of data—that classic architectures were never designed to support.

That’s one reason computing architectures must evolve. A system being developed by Hewlett-Packard, known as the Machine (see “Machine Dreams”), uses electronic components called memristors to store and process information—offering more powerful ways to handle large amounts of data—together with silicon photonic components that allow data to be transported at very high speeds using light. HP’s researchers are also developing a new operating system, Machine OS, to make the most of this new architecture.

Reinvention like this doesn’t solve all our problems; in some cases it creates new ones, chief among them compatibility with existing software. The consistent architecture of IBM’s System/360 in the 1960s and 1970s ensured that buyers of early models could upgrade their machines confident that the programs they were already using would continue to work. Can a new architecture evolve without forcing every program to evolve with it?

Probably. Since the days of the System/360, compilers and program translators—tools that allow software to run on different architectures—have matured substantially. We’ll need to make the most of such tools if we hope to loosen our ties to legacy architectures and allow computers like the Machine to thrive.
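The principle behind such translators can be sketched simply. The Python example below rewrites a program written for one invented instruction set into an equivalent program for another; neither instruction set is real, and this is not how HP or anyone else ships software, but it shows that translation is a mechanical, per-instruction mapping rather than a rewrite by hand.

```python
# A toy program translator: rewrite code written for a "legacy" invented
# instruction set so it runs on a machine with a different, also invented,
# instruction set. The point is only that the mapping is mechanical.

legacy_program = [
    ("LOAD", "r0", 100),
    ("LOAD", "r1", 23),
    ("ADD",  "r0", "r1"),
]

# Suppose the hypothetical new machine has no plain ADD, only a fused
# multiply-add. ADD then becomes a multiply-add with a multiplier of 1.
def translate(instr):
    op, dst, src = instr
    if op == "LOAD":
        return [("MOVE", dst, src)]
    if op == "ADD":
        return [("MULADD", dst, src, 1)]
    raise ValueError(f"untranslatable instruction: {op}")

new_program = [new for old in legacy_program for new in translate(old)]
print(new_program)
```

Scale that idea up to full instruction sets, add optimization, and you have the kind of tooling that lets old software follow us onto new machines.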

Martha Kim is an associate professor of computer science at Columbia University.
