
In a lab at Philips Electronics in the Netherlands, researchers are stalking the solution to one of the great problems of modern life: having to hunt through hundreds of television channels for something you’d like to watch. The lab’s answer is a TV that recognizes you when you walk into the room, knows you like occult thrillers, finds one it recorded at three in the morning, and puts it up on the screen. Alongside will be smaller images of a British news report on the company you just invested in, the Web page carrying the eBay auction you bid in, and the high-resolution video scene you recorded on your cell phone earlier in the day. Ready to switch channels? Just speak up and tell the TV what you want.

Perhaps the best thing about this talented device is that you’ll be able to buy it in about seven years for about what you’d pay for a dumb television today. Philips has already demonstrated these sorts of capabilities in its lab and recently rolled out a semi-intelligent prototype. “We can already produce a mostly digital television that allows you to add functions through software and that will cost in the ballpark of a conventional analog set,” says Theo Claasen, chief technology officer for the company’s semiconductor group.

We’ve come to take for granted that the electronics industry keeps hurling new and improved products at us, and it’s a solid bet that this won’t slow down in the near future. Electronic products are largely defined by the microprocessors inside them, and the power and speed of these chips continue to climb exponentially. The amazing resiliency of Moore’s Law (Intel cofounder Gordon Moore’s prediction nearly 40 years ago that the number of transistors on a chip would double every year) means that chips have gone from having a few thousand transistors three decades ago to over 100 million today, while the price per transistor has dropped from $1 to a millionth of a cent. And since transistor density roughly translates to computing and communications speed, you can thank Moore’s Law for innovations like online shopping, in-car navigation systems, and cheap cell phones. “Transistors are free,” says Krishnamurthy Soumyanath, director of communications-circuits research at Intel. “We can solve problems by throwing more transistors at them.”
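That exponential climb is simple compound doubling. Here is a minimal sketch of the arithmetic (the `projected_transistors` helper, the two-year doubling period, and the starting count are illustrative assumptions, not figures from the article):

```python
# Minimal sketch of Moore's Law compounding. The doubling period is an
# assumption: Moore's original 1965 prediction was one year, and the
# commonly cited modern reading is 18 to 24 months.
def projected_transistors(start_count: int, years: float,
                          doubling_period: float = 2.0) -> int:
    """Project a transistor count forward, doubling every
    `doubling_period` years."""
    return int(start_count * 2 ** (years / doubling_period))

# Starting from roughly 100 million transistors, six years out:
print(f"{projected_transistors(100_000_000, years=6):,}")  # 800,000,000
```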

Despite skeptics’ perennial warnings that Moore’s Law will peter out, the industry is set to hew to it for at least the next three generations of microprocessors, expected to come out over the next six years. Right now the smallest standard features of the fastest silicon transistors are 90 nanometers wide. Before the end of 2005, manufacturers expect to make 65-nanometer transistors. And blueprints for reducing that to 45 nanometers by 2007 are in the works.
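Those node numbers follow the industry’s usual scaling rule: each generation shrinks linear feature size by roughly a factor of 0.7, which halves the area a transistor occupies and so roughly doubles how many fit on a chip. A quick check of the figures above (the arithmetic is mine, not the article’s):

```python
# Quick check of the scaling behind 90 -> 65 -> 45 nm: a ~0.7x linear
# shrink means ~0.7**2 = ~0.5x area per transistor, i.e. ~2x density.
for old_nm, new_nm in [(90, 65), (65, 45)]:
    linear = new_nm / old_nm
    print(f"{old_nm} nm -> {new_nm} nm: linear shrink {linear:.2f}, "
          f"area factor {linear**2:.2f}")
# 90 nm -> 65 nm: linear shrink 0.72, area factor 0.52
# 65 nm -> 45 nm: linear shrink 0.69, area factor 0.48
```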

Miniaturization means that more transistors can be squeezed onto a chip. This makes microprocessors faster, in part because electrons have less distance to travel between transistors. It also makes memory chips more capacious. Today, the fastest consumer microprocessors have about 180 million transistors and operate at a speed of about three gigahertz (roughly speaking, three billion simple operations per second), while the adjacent random-access memory chips hold two gigabytes of data or more. By 2007, processors will pack more than a billion transistors, hit speeds approaching 10 gigahertz, and be backed up by several gigabytes of RAM. With that kind of power and memory, PCs will be able to transport you to ultrarealistic online virtual worlds, hold up their end of a conversation (on certain topics, anyway), and quickly search through hours of your vacation videos for that bit where Uncle Arnold capsizes his canoe.

Predicting what other sorts of gadgets will result from this explosion in computing power is, of course, the $64,000 question (make that the $64 billion question). For all his prescience about chips, Gordon Moore himself failed to foresee the PC or the Internet, never mind the personal digital assistant or smart cell phone. Home videophones and pen-based computers, on the other hand, have managed to stay off consumers’ radar screens despite decades of hype. “If ten years ago someone told you about the World Wide Web, MP3 players, and video cameras that fit in the palm of your hand, you wouldn’t have believed them,” says Jeffrey Bokor, a professor in the Department of Electrical Engineering and Computer Science at the University of California, Berkeley. “What we’re going to see over the coming years will be equally hard to imagine.”
