
Sands of Time

From the editor in chief

You hold in your hands the first special issue of Technology Review for 2000. It’s on a subject we think will increase in importance not just through this year but for the rest of the decade and perhaps for the rest of the new century. That subject: What happens after current silicon-based computing technologies begin to reach the limits of their rapid increase in speed?

For the last four decades, computers have presented a remarkable picture. While dramatically increasing in speed and computing power, they’ve also dropped precipitously in price. Underlying this pattern is a rule of thumb known as “Moore’s Law,” named for Intel co-founder Gordon Moore, who formulated it in the 1960s. Moore hypothesized that engineers would be able to squeeze more circuit elements into integrated circuits at a pace that represented a doubling every year or so.

The exponential growth in computing that Moore described underlies the growth of the Internet and the accompanying economic boom we’re now experiencing. Which brings up the unsettling question of what happens when Moore’s Law runs out of gas. After all, it’s not a law of nature. It’s just a rule of thumb describing what’s happened in one industry, computer manufacturing, over a couple of decades. Nothing says it’s eternal. Indeed, there have been plenty of cries of alarm before about the end of Moore’s Law, but, as Charles Mann points out in his introductory article (“The End of Moore’s Law?”), this time there is ample reason to take the alarms seriously.

If silicon-based computing technology reaches its limits in the next decade, what is waiting in the labs to take its place? That’s the question this special issue takes on. In four articles on new approaches to computing (Molecular Computing, Quantum Computing, Biological Computing, and DNA Computing), the issue outlines how the process of computation can be divorced from silicon and embodied in new media.

None of these approaches is ready to serve as an all-purpose replacement for silicon. In fact, one or more may never be more than specialized methods applied in particular niches, such as high-level cryptography. Which raises the question of why major corporations would invest money, time and energy in researching such high-risk propositions. Robert Buderi gives some surprising answers to that question and follows it up with an interview with Carly Fiorina, the new CEO of Hewlett-Packard, a company that is taking molecular computing very seriously.

Although all of these new approaches are high-risk research ventures, one of them, or one of their technological descendants, might one day turn out to be as revolutionary as integrated circuits incised on silicon chips. We won’t know which one for some time, since it takes a good long while for a profound new technology to work its way out of the laboratory and transform the economy. But if you want to stay ahead of the curve, you can’t wait until the results of that new technology are evident to all. You have to start looking much earlier, when that revolutionary new technology is being born in cutting-edge laboratories, academic and corporate. For those who want early warning of the next computing technology with the potential to be as revolutionary as silicon, the time to pay attention is now. And this issue is the place to start.
