10 Breakthrough Technologies 2004

Emerging Technologies: 2004

Technology Review unveils its annual selection of hot new technologies about to affect our lives in revolutionary ways, and profiles the innovators behind them.
February 1, 2004

With new technologies constantly being invented in universities and companies across the globe, guessing which ones will transform computing, medicine, communication, and our energy infrastructure is always a challenge. Nonetheless, Technology Review’s editors are willing to bet that the 10 emerging technologies highlighted in this special package will affect our lives and work in revolutionary ways, whether next year or next decade. For each, we’ve identified a researcher whose ideas and efforts both epitomize and reinvent his or her field. The following snapshots of the innovators and their work provide a glimpse of the future these evolving technologies may bring.


10 Breakthrough Technologies

  • Universal Translation

    Yuqing Gao is bilingual, and so is her computer. At IBM’s Watson Research Center in Yorktown Heights, NY, the computer scientist, role-playing a doctor, speaks Mandarin Chinese into a personal digital assistant. In a few seconds, a pleasant female voice emanating from the device asks, in English, “What are your symptoms?” Gao’s system, designed to help doctors communicate with patients, can be extended to other languages and situations. The ultimate goal, she says, is to develop “universal translation” software that gleans meaning from phrases in one language and conveys it in any other language, enabling people from different cultures to communicate.

    Gao’s work is at the forefront of escalating efforts to use mathematical models and natural-language-processing techniques to make computerized translation more accurate and efficient, and more adaptable to new languages. Distinct from speech recognition and synthesis, the technology behind universal translation has matured in recent years, driven in part by global business and security needs. “Advances in automatic learning, computing power, and available data for translation are greater than we’ve seen in the history of computer science,” says Alex Waibel, associate director of Carnegie Mellon University’s Language Technologies Institute, which supports several parallel efforts in the field.
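
    A minimal sketch of the mathematical machinery at work, assuming the classic noisy-channel formulation of statistical translation: to translate a foreign phrase f, choose the target-language phrase e that maximizes P(e) × P(f | e), weighing fluency against faithfulness. All words and probabilities below are invented for illustration; real systems learn millions of such parameters from bilingual text.

    ```python
    import math

    # Toy noisy-channel translation; both tables are invented for illustration.
    t_model = {  # translation model: P(source_word | target_word)
        ("symptoms", "zhengzhuang"): 0.8,
        ("feelings", "zhengzhuang"): 0.1,
    }
    lm = {"symptoms": 0.02, "feelings": 0.05}  # language model: P(target_word)

    def score(target: str, source: str) -> float:
        """Noisy-channel score: log P(e) + log P(f | e)."""
        return math.log(lm[target]) + math.log(t_model[(target, source)])

    source = "zhengzhuang"  # pinyin for the Mandarin word for "symptoms"
    best = max(["symptoms", "feelings"], key=lambda e: score(e, source))
    print(best)  # "symptoms": translation evidence outweighs the weaker prior
    ```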

  • Synthetic Biology

    Perched on the gently sloping hills of Princeton University’s brick and ivy campus, Ron Weiss’s biology laboratory is stocked with the usual array of microscopes, pipettes, and petri dishes. Less typical is its location: crammed into the Engineering Quadrangle, it stands out among the electrical and mechanical engineering labs. Yet it’s an appropriate spot for Weiss. A computer engineer by training, he discovered the allure of biology during graduate school, when he began programming cells as if they were computers.

    Weiss is one of just a handful of researchers delving into the inchoate field of synthetic biology, assiduously assembling genes into networks designed to direct cells to perform almost any task their programmers conceive. Inserted into simple bacteria, these networks could advance biosensing, allowing inspectors to pinpoint land mines or biological weapons; added to human cells, they might let researchers build entire organs for transplantation. “We want to create a set of biological components, DNA cassettes that are as easy to snap together, and as likely to function, as a set of Legos,” says Tom Knight, an MIT computer-engineer-cum-biologist and the graduate advisor who turned Weiss on to the idea.
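
    What “programming” a cell means in practice can be sketched with a simple model. The snippet below simulates a transcriptional inverter, a genetic NOT gate in which a repressor protein shuts off production of an output protein, using a standard Hill-repression equation; all rate constants are invented for illustration.

    ```python
    # Genetic NOT gate: dP/dt = alpha / (1 + (R/K)**n) - gamma * P,
    # where R is the repressor (input) and P the output protein.
    def simulate_inverter(repressor: float, steps: int = 10000, dt: float = 0.01) -> float:
        alpha, K, n, gamma = 100.0, 10.0, 2.0, 1.0  # assumed rate constants
        p = 0.0
        for _ in range(steps):  # simple Euler integration to steady state
            p += (alpha / (1.0 + (repressor / K) ** n) - gamma * p) * dt
        return p

    print(simulate_inverter(repressor=0.0))    # input LOW  -> output HIGH (~100)
    print(simulate_inverter(repressor=100.0))  # input HIGH -> output LOW  (~1)
    ```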

  • Nanowires

    Few emerging technologies have offered as much promise as nanotechnology, touted as the means of keeping the decades-long electronics shrinkfest in full sprint and transfiguring disciplines from power production to medical diagnostics. Companies from Samsung Electronics to Wilson Sporting Goods have invested in nanotech, and nearly every major university boasts a nanotechnology initiative. Red hot, even within this R&D frenzy, are the researchers learning to make the nanoscale wires that could be key elements in many working nanodevices.

    “This effort is critical for the success of the whole [enterprise of] nanoscale science and technology,” says nanowire pioneer Peidong Yang of the University of California, Berkeley. Yang has made exceptional progress in fine-tuning the properties of nanowires. Compared to other nanostructures, “nanowires will be much more versatile, because we can achieve so many different properties just by varying the composition,” says Charles Lieber, a Harvard University chemist who has also been propelling nanowire development.

  • Bayesian Machine Learning

    When a computer scientist publishes genetics papers, you might think it would raise colleagues’ eyebrows. But Daphne Koller’s research using a once obscure branch of probability theory called Bayesian statistics is generating more excitement than skepticism. The Stanford University associate professor is creating programs that, while tackling questions such as how genes function, are also illuminating deeper truths about the long-standing computer science conundrum of uncertainty: learning patterns, finding causal relationships, and making predictions based on inevitably incomplete knowledge of the real world. Such methods promise to advance the fields of foreign-language translation, microchip manufacturing, and drug discovery, among others, sparking a surge of interest from Intel, Microsoft, Google, and other leading companies and universities.

    How does an idea conceived by an 18th-century minister (Thomas Bayes) help modern computer science? Unlike older approaches to machine reasoning, in which each causal connection (“rain makes grass wet”) had to be explicitly taught, programs based on probabilistic approaches like Bayesian math can take a large body of data (“it’s raining,” “the grass is wet”) and deduce likely relationships, or “dependencies,” on their own. That’s crucial because many decisions programmers would like to automate (say, personalizing search engine results according to a user’s past queries) can’t be planned in advance; they require machines to weigh unforeseen combinations of evidence and make their best guesses. Says Intel research director David Tennenhouse, “These techniques are going to impact everything we do with computers, from user interfaces to sensor data processing to data mining.”
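
    The wet-grass example reduces to one application of Bayes’ rule: P(rain | wet) = P(wet | rain) P(rain) / P(wet). A minimal sketch, with all probabilities invented for illustration:

    ```python
    # Given that the grass is wet, how likely is it that it rained?
    p_rain = 0.2                 # prior P(rain)
    p_wet_given_rain = 0.9       # P(wet | rain)
    p_wet_given_dry = 0.1        # P(wet | no rain), e.g., sprinklers

    # Total probability of wet grass, then Bayes' rule.
    p_wet = p_wet_given_rain * p_rain + p_wet_given_dry * (1 - p_rain)
    p_rain_given_wet = p_wet_given_rain * p_rain / p_wet
    print(f"P(rain | wet grass) = {p_rain_given_wet:.2f}")  # 0.69
    ```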

  • T-Rays

    With the human eye responsive to only a narrow slice of the electromagnetic spectrum, people have long sought ways to see beyond the limits of visible light. X-rays illuminate the ghostly shadows of bones, ultraviolet light makes certain chemicals shine, and near-infrared radiation provides night vision. Now researchers are working to open a new part of the spectrum: terahertz radiation, or t-rays. Able to easily penetrate many common materials without the medical risks of x-rays, t-rays promise to transform fields like airport security and medical imaging, revealing not only the shape but also the composition of hidden objects, from explosives to cancers.

    In the late 1990s, Don Arnone and his group at Toshiba’s research labs in Cambridge, England, were eyeing t-rays as an alternative to dental x-rays. The idea was that t-rays, operating in the far-infrared region just before wavelengths stretch into microwaves, would be able to spot decay without harmful ionizing radiation. In tests, the researchers fired powerful but extremely short pulses of laser light at a semiconductor chip, producing terahertz radiation (so called because it has frequencies of trillions of waves per second). Passing through gaps or different thicknesses of material changes the rays’ flight time, so by measuring how long each t-ray took to pass through an extracted tooth and reach a detector, the researchers were able to assemble a 3-D picture of the tooth.
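
    The flight-time arithmetic is straightforward: a pulse crossing a material of refractive index n and thickness d lags one crossing the same distance of air by (n − 1)d/c. A minimal sketch, with index values invented for illustration (not Arnone’s data):

    ```python
    C = 3.0e8  # speed of light, m/s

    def extra_delay(n: float, thickness_m: float) -> float:
        """Extra flight time (s) versus the same path through air."""
        return (n - 1.0) * thickness_m / C

    healthy = extra_delay(n=3.1, thickness_m=1e-3)  # ~1 mm of sound enamel
    decayed = extra_delay(n=2.4, thickness_m=1e-3)  # decayed region, lower index
    print(f"{(healthy - decayed) * 1e12:.1f} ps")   # ~2.3 ps difference
    # Mapping such picosecond shifts pixel by pixel yields the 3-D image.
    ```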

  • Distributed Storage

    Whether it’s organizing documents, spreadsheets, music, photos, and videos or maintaining regular backup files in case of theft or a crash, taking care of data is one of the biggest hassles facing any computer user. Wouldn’t it be better to store data in the nooks and crannies of the Internet, a few keystrokes away from any computer, anywhere? A budding technology known as distributed storage could do just that, transforming data storage for individuals and companies by making digital files easier to maintain and access while eliminating the threat of catastrophes that obliterate information, from blackouts to hard-drive failures.

    Hari Balakrishnan is pursuing this dream, working to free important data from dependency on specific computers or systems. Music-sharing services such as KaZaA, which let people download and trade songs from Internet-connected PCs, are basic distributed-storage systems. But Balakrishnan, an MIT computer scientist, is part of a coalition of programmers who want to extend the concept to all types of data. The beauty of such a system, he says, is that it would provide all-purpose protection and convenience without being complicated to use. “You can now move [files] across machines,” he says. “You can replicate them, remove them, and the way in which [you] get them is unchanged.” With inability to access data sometimes costing companies millions in revenue per hour of downtime, according to Stamford, CT-based Meta Group, a distributed-storage system could dramatically enhance productivity.
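
    One building block such systems lean on is consistent hashing, which lets any machine compute which nodes are responsible for a file without consulting a central directory (Balakrishnan’s Chord project at MIT popularized one variant). A minimal sketch, with hypothetical node names:

    ```python
    import hashlib

    def h(key: str) -> int:
        """Hash a string onto a circular 32-bit identifier space."""
        return int(hashlib.sha1(key.encode()).hexdigest(), 16) % (2 ** 32)

    nodes = ["node-a.example.org", "node-b.example.org", "node-c.example.org"]
    ring = sorted((h(n), n) for n in nodes)

    def lookup(filename: str, replicas: int = 2) -> list:
        """First `replicas` ring positions at or after the file's hash."""
        k = h(filename)
        idx = next((i for i, (nid, _) in enumerate(ring) if nid >= k), 0)
        return [ring[(idx + j) % len(ring)][1] for j in range(replicas)]

    print(lookup("vacation-photos.zip"))  # same answer from any machine
    ```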

  • RNAi Therapy

    From heart disease to hepatitis, cancer to AIDS, a host of modern ailments are triggered by our own errant genes, or by those of invading organisms. So if a simple technique could be found for turning off specific genes at will, these diseases could, in theory, be arrested or cured. Biochemist Thomas Tuschl may have found just such an off switch in humans: RNA interference (RNAi). While working at Germany’s Max Planck Institute for Biophysical Chemistry, Tuschl discovered that tiny double-stranded molecules of RNA designed to target a certain gene can, when introduced into human cells, specifically block that gene’s effects, a targeting step sketched below.

    Tuschl, now at Rockefeller University in New York City, first presented his findings at a meeting in Tokyo in May 2001. His audience was filled with doubters who remembered other much-hyped RNA techniques that ultimately didn’t work very well. “They were very skeptical and very critical,” recalls Tuschl. What the skeptics didn’t realize was that RNAi is much more potent and reliable than earlier methods. “It worked the first time we did the experiment,” Tuschl recalls. Within a year, the doubts had vanished, and now the technique has won universal acceptance, spawning research at every major drug company and university and likely putting Tuschl on the short list for a Nobel Prize.
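
    The targeting step itself is simple base-pairing: an siRNA guide strand silences a gene when it is complementary to a stretch of that gene’s messenger RNA. A minimal sketch, with invented sequences:

    ```python
    COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

    def reverse_complement(rna: str) -> str:
        return "".join(COMPLEMENT[base] for base in reversed(rna))

    def finds_target(guide: str, mrna: str) -> bool:
        """True if the mRNA contains the site this guide strand pairs with."""
        return reverse_complement(guide) in mrna

    mrna = "AUGGCUAACGGAUUCCGGAUCAAGCUUUAA"           # hypothetical transcript
    guide = reverse_complement("GGAUUCCGGAUCAAGCUUU")  # 19-nt guide against it
    print(finds_target(guide, mrna))  # True: this transcript would be silenced
    ```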

  • Power Grid Control

    Power grids carry the seeds of their own destruction: massive flows of electricity that can race out of control in just seconds, threatening to melt the very lines that carry them. Built in the days before quick-reacting microprocessors and fiber optics, these networks were never designed to detect and squelch systemwide disturbances. Instead, each transmission line and power plant must fend for itself, shutting down when power flows spike or sag. The shortcomings of this system are all too familiar to the 50 million North Americans from Michigan to Ontario whose lights went out last August: as individual components sense trouble and shut down, the remaining power flows become even more disturbed, and neighboring lines and plants fall like multimillion-dollar dominoes. Often-needless shutdowns result, costing billions, and the problem is only expected to get worse as expanding economies push more power onto grids.

    Christian Rehtanz thinks the time has come for modern control technology to take back the grid. Rehtanz, group assistant vice president for power systems technology with Zurich, Switzerland-based engineering giant ABB, is one of a growing number of researchers seeking to build new smarts into grid control rooms. These engineers are developing hardware and software to track electric flows across continent-wide grids several times a second, identify disturbances, and take immediate action. While such “wide area” control systems remain largely theoretical, Rehtanz and his ABB colleagues have fashioned one that is ready for installation today. If their design works as advertised, it will make power outages 100 times less likely, protecting grids against everything from consumption-inducing heat waves to terrorism. “We can push more power through the grid while, at the same time, making the system more predictable and more reliable,” says Rehtanz.
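
    In outline, such a scheme is a fast sense-decide-act loop: poll synchronized readings from across the grid several times a second, flag abnormal ones, and act before trouble cascades. A minimal sketch, with stations, readings, and thresholds all invented for illustration:

    ```python
    NOMINAL_HZ = 60.0
    MAX_DEVIATION_HZ = 0.5  # assumed alarm threshold

    def check_grid(readings: dict) -> list:
        """Return stations whose frequency deviates enough to need action."""
        return [station for station, hz in readings.items()
                if abs(hz - NOMINAL_HZ) > MAX_DEVIATION_HZ]

    # One polling cycle; in practice this repeats several times per second.
    snapshot = {"station-A": 59.97, "station-B": 58.90, "station-C": 60.02}
    for station in check_grid(snapshot):
        print(f"disturbance at {station}: reroute flows or shed load")
    ```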

  • Microfluidic Optical Fibers

    The blazing-fast Internet access of the future (imagine downloading movies in seconds) might just depend on a little plumbing in the network. Tiny droplets of fluid inside fiber-optic channels could improve the flow of data-carrying photons, speeding transmission and improving reliability. Realizing this radical idea is the goal of University of Illinois physicist John Rogers, whose prototype devices, called microfluidic optical fibers, may be the key to superfast delivery of everything from e-mail to Web-based computer programs, once “bandwidth” again becomes the mantra.

    Rogers began exploring fluid-filled fibers more than two years ago as a researcher at Lucent Technologies’ Bell Labs. While the optical fibers that carry today’s phone and data transmissions consist of glass tubing that is flexible but solid, Rogers employs fibers bored through with microscopic channels, ranging from one to 300 micrometers in diameter, depending on their use. Rogers didn’t invent the fibers, but he and his team showed that by pumping tiny amounts of various fluids into them and then controlling the expansion, contraction, and movement of these liquid “plugs,” they could change the optical properties of the fibers. Structures such as tiny heating coils printed directly on the fiber precisely control the size, shape, and position of the plugs. Modifying the plugs’ properties enables them to perform critical functions, such as correcting error-causing distortions and directing data flows more efficiently, thus boosting bandwidth far more cheaply than is possible today.
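
    The optics behind the plugs can be sketched with one formula: light accumulates phase 2πnL/λ over a channel of length L and refractive index n, so replacing air with fluid in even a 100-micrometer plug shifts the phase by many radians. All values below are illustrative:

    ```python
    import math

    WAVELENGTH_M = 1.55e-6  # a standard telecom wavelength

    def phase(n: float, length_m: float) -> float:
        """Optical phase accumulated over a channel segment, in radians."""
        return 2 * math.pi * n * length_m / WAVELENGTH_M

    plug_len = 100e-6  # hypothetical 100-micrometer plug
    shift = phase(1.45, plug_len) - phase(1.0, plug_len)  # fluid vs. air
    print(f"phase shift: {shift:.0f} rad")  # moving the plug toggles this
    ```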

  • Personal Genomics

    Three billion. That’s the approximate number of DNA “letters” in each person’s genome. The Human Genome Project managed a complete, letter-by-letter sequence of a model human, a boon for research. But examining the specific genetic material of each patient in a doctor’s office by wading through those three billion letters just isn’t practical. So to achieve the dream of personalized medicine, a future in which a simple blood test will determine the best course of treatment based on a patient’s genes, many scientists are taking a shortcut, sketched below: focusing only on the differences between people’s genomes.

    David Cox, chief scientific officer of Perlegen Sciences in Mountain View, CA, is turning that strategy into a practical tool that will enable doctors and drug researchers to quickly determine whether a patient’s genetic makeup results in greater vulnerability to a particular disease, or makes him or her a suitable candidate for a specific drug. Such tests could eventually revolutionize the treatment of cancer, Alzheimer’s, asthma-almost any disease imaginable. And Cox, working with some of the world’s leading pharmaceutical companies, has gotten an aggressive head start in making it happen.
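
    The “differences only” shortcut lends itself to a minimal sketch: genotype a panel of single-letter variants (SNPs) and compare a patient’s pattern against variants linked to a disease or a drug response. The SNP identifiers, genotypes, and associations below are all invented for illustration:

    ```python
    # Hypothetical patient genotypes at three SNP positions.
    patient_snps = {"rs0000001": "AG", "rs0000002": "CC", "rs0000003": "TT"}

    # Hypothetical knowledge base: genotypes associated with drug response.
    risk_markers = {"rs0000002": "CC", "rs0000003": "CT"}

    matches = [snp for snp, genotype in risk_markers.items()
               if patient_snps.get(snp) == genotype]
    print(f"risk markers present: {matches}")  # ['rs0000002']
    ```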