
R&D 2005

Technology Review’s annual look at research trends in corporations is led this year by pharmaceutical and biotech companies.
September 1, 2005

The 2005 edition of the TR R&D Scorecard (PDF) shows that worldwide corporate spending is picking up (Big Spenders), but that the gains are unevenly distributed (Where the Growth Is). The biggest advances are in the life sciences, which also happen to be among the most research-intensive industries (Innovation Sectors): 2004 R&D spending among the biotech companies on the list shot up by an average of 69 percent over the previous year.

The gain at pharmaceutical companies was less spectacular but still a strong 22 percent. IT companies, on the other hand, have as a group barely increased their R&D outlays; telecommunications and computer hardware companies, on average, spent less than in 2003. Spending in telecom remains particularly troubled, with several leading companies, including Motorola, Ericsson, and NTT, reporting double-digit decreases. In IT, however, software remains an exception; Microsoft paced the sector to a 20 percent increase in research spending in 2004.

The scorecard ranks companies by the Technology Review Innovation Index (TR Innovation Index – Top 15), which takes into account R&D spending levels, spending increases, and R&D as a proportion of sales; five of the top 10 companies according to this metric are in life sciences.

But numbers alone don’t tell the corporate research story. Another indicator of vibrant R&D is willingness to invest in visionary projects that may not pay off for many years–if ever. In this spirit, we spotlight three “blue sky” research efforts.

Intel’s use of lasers to detect biological molecules with exquisite sensitivity could help researchers understand the causes of cancer and other diseases. Lucent Technologies’ Bell Labs–which has in the past decade severely curtailed the basic research that once made it such a jewel–is making progress toward the radical concept of quantum computing. And IBM has launched an effort to use supercomputers to model the human brain. These projects provide a heartening counterweight to the common charge that industry is overly fixated on next quarter’s results. – Edited by Herb Brody

The Computer Brain
By David Talbot

The neocortex constitutes the bulk of the human brain and is the presumed seat of learning, language, memory, and whatever it means to be human. It contains many billions of neurons, and each neuron can interact with nearby neurons in thousands of different ways. The operations of even a single neuron are difficult to measure, and biologists don’t agree on how many distinct subclasses of neurons are present in the neocortex, how the six layers of the neocortex interact with one another, and whether the system behaves differently from one part of the neocortex to the next.

“It’s a humongous mess,” says Michael Beierlein, a neuroscientist at Harvard Medical School. And when neuroscientists study the electrochemical processes that take place in that mess, “ultimately we just don’t know what the crucial features are, and which ones we can safely ignore: what is biological noise, what is important, what is an experimental artifact.”

Neuroscientists around the world are trying to decipher the neocortex, because understanding it better could provide insights into everything from psychiatric disorders and brain disease to learning and memory. To that end, many groups are trying to create computer models of how neurons function. A research project launched this year by IBM is the most ambitious such effort ever attempted: the company and Swiss research partners hope to create a functioning 3-D model of a two-millimeter chunk of neocortex containing 60,000 neurons–a unit known as a neocortical column.

The neuron modeling project “is going to be larger than anything done before, by an order of magnitude,” says Charles Peck, the computer scientist at IBM’s T. J. Watson Research Center in Yorktown Heights, NY, who heads the project, dubbed “Blue Brain.”

The researchers will take raw data collected from rat neurons at the Swiss Federal Institute of Technology in Lausanne and feed it into an IBM supercomputer that is among the world’s fastest. Henry Markram, the Swiss neuroscientist heading the biological end of the project, says a graphical representation of just the 10,000 neurons in a rat neocortical column will require up to two terabytes of storage–roughly the amount of data that can be held in 400 standard recordable DVDs. IBM computer scientists experienced in simulating biological systems will help build a 3-D model that mimics the interactions of these neurons and compare its performance against Markram’s laboratory data.
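As a back-of-envelope check of Markram's comparison, assuming "standard recordable DVD" means a 4.7 GB single-layer disc (a detail the article does not specify):

```python
# Rough check of the storage comparison above.
# Assumption: a "standard recordable DVD" is a 4.7 GB single-layer disc.
TERABYTE = 1e12          # bytes, decimal convention
DVD_CAPACITY = 4.7e9     # bytes per single-layer DVD (assumed)

column_data = 2 * TERABYTE                 # up to two terabytes per column
dvds_needed = column_data / DVD_CAPACITY
print(round(dvds_needed))                  # 426 -- i.e. "roughly 400 DVDs"
```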

The job will be vast. “Think of a neuron as a tree, with roots and branches,” says Markram. “Imagine if you take 60,000 of these trees and squeeze them in the space of a pinhead. That is the kind of architecture you are looking at, with the roots of trees touching branches of other trees.” And that’s just for one neocortical column; the human neocortex is estimated to contain tens of millions of them. But if all goes well, “we will be able to see where the information goes, how it is represented, and how it is stored on a tree,” Markram says. “Then we can understand what can go wrong.” Markram believes the project could yield possible targets for drugs to treat brain diseases in 10 years.

That is certainly ambitious. “The simulation may lead to a better understanding of some of the circuitry,” says Tai Sing Lee, a computer scientist and neurophysicist at the Center for the Neural Basis of Cognition, a joint project of the University of Pittsburgh and Carnegie Mellon University. However, he adds, “Simulating the human brain and curing disease are extremely far away.” Viewed against the magnitude of the task, says Lee, IBM’s Blue Brain project is worthwhile but “a small step in biology.”

Precision Biology
By Claire Tristram

Ever since James Watson and Francis Crick unveiled their helical model of DNA in 1953, it has been an iconographic symbol of science. But no matter how familiar the structure of DNA becomes, observing the molecular pieces from which it is built remains a tantalizing challenge–and one for which a number of competing technologies are being developed. A tool that consistently offers researchers a way to observe biological processes at the molecular level would be invaluable. In particular, the ability to closely observe the nucleotides that make up DNA, combined with the ongoing work on the human genome, could eventually yield more-powerful methods for diagnosing disease.

At Intel, technologists pursuing better biological imaging have adopted an analytical method widely used in semiconductor R&D. In May, Intel’s Precision Biology group published a paper describing its use of Raman spectroscopy to detect single molecules of two of the four nucleotides that make up DNA: deoxyguanosine monophosphate (dGMP) and deoxyadenosine monophosphate (dAMP). While single molecules of dAMP had previously been detected with Raman spectroscopy, dGMP molecules had not. And Intel’s approach greatly improved the consistency with which a Raman effect was detected. “We wanted to push the limits of sensitivity,” says Andrew Berlin, lead researcher for the five-year-old group.

Raman spectroscopy takes advantage of the fact that light beams passing through different substances will scatter in different ways, emerging with different sets of characteristic wavelengths. Such patterns can serve as fingerprints for identifying specific compounds. The Raman approach offers advantages over other technologies for single-molecule detection, in that it’s one of the most sensitive techniques available and can also be used to detect molecules in a very dilute solution of water–or potentially in the watery world of a cell. What’s more, the technique provides a way to directly observe molecules without labeling them with fluorescent tags.

One way to intensify the Raman effect is to induce it in close proximity to metal. Berlin’s team, adapting techniques already being used by Intel in its manufacturing processes, first created a layer of silicon that was pocked with nanoscale pores to increase the area of the surface to which molecules could bind. They next coated the silicon with molecules containing silver and deposited a biological sample on the coated surface. The group bombarded the sample with pulses from multiple lasers and, in recent experiments, caused a single nucleotide to emit a signal strong enough to be detected. “We’re right in the middle of one of the best labs in the world for optimizing nanoparts, so we could take advantage of all the experience that comes out of our processor research,” Berlin says.

The significance of Intel’s approach is that it can boost a molecule’s signal so dramatically–between 100 and 10,000 times, depending on the molecule being studied–that it will allow observation of single molecules without chemically altering them. “The Intel experiments are the first that demonstrate the great potential of this kind of Raman technique for detecting single molecules,” says Eric O. Potma, who is working on similar research at the University of California, Irvine. Also, while fluorescent labeling works only for molecules that can be tagged, Intel’s research will likely find broader applications. “With single-molecule Raman, we might be able to monitor the details of molecules that have remained invisible to us with fluorescence spectroscopy,” says Potma.

The ability to better see how molecules operate could help fulfill a dream cherished by many biologists. “Being able to study single molecules will transform our thinking,” says cell biologist Mark Roth of the Fred Hutchinson Cancer Research Center in Seattle, which is collaborating with Intel on this project.

Quantum Computing
By Dan Cho

More than half a century after inventing the transistor–the foundation for modern electronics, computing, and telecommunications–Lucent Technologies’ Bell Labs is pursuing another technology that could radically change information technology: quantum computing. Today’s transistors continue to get smaller, allowing computer speeds to double every year or two. But a quantum computer would leap far ahead of that pace: if such a machine is ever built, it could solve certain problems millions of times faster than any conventional computer.

A conventional computer stores information as bits, which are represented as 1s and 0s. Quantum computers rely on quantum bits, or qubits, which can hold values of 1, 0, or–and this is the part that defies intuition–some quantum blend of those two values. Another quantum effect known as “entanglement” allows two or more qubits to coordinate their behavior, even when they don’t appear to be interacting.
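A toy statevector model can make these two ideas concrete (an illustration in ordinary Python, not a description of any real quantum hardware):

```python
import math
import random

# Two qubits have four basis states |00>, |01>, |10>, |11>; the state is a
# list of four amplitudes whose squared magnitudes are probabilities.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]  # an entangled "Bell" state

probs = [round(abs(a) ** 2, 2) for a in bell]
print(probs)  # [0.5, 0.0, 0.0, 0.5]


def measure(state):
    """Sample one basis state according to the squared amplitudes."""
    r, cum = random.random(), 0.0
    for i, amp in enumerate(state):
        cum += abs(amp) ** 2
        if r < cum:
            return format(i, "02b")
    return format(len(state) - 1, "02b")


# Entanglement in action: the two bits always agree -- '00' or '11',
# never '01' or '10', no matter how many times you measure.
outcomes = {measure(bell) for _ in range(1000)}
print(outcomes <= {"00", "11"})  # True
```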

These strange properties would make qubits extremely powerful tools for attacking certain computing problems, such as factoring the large numbers used in encryption and searching huge databases. (Two Bell Labs researchers, Peter Shor and Lov Grover, devised breakthrough quantum algorithms for solving these two problems in the 1990s.)
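Grover's search can even be simulated classically on a small scale. The sketch below runs his amplitude-amplification step on N = 8 items (the item index and N are arbitrary choices for the illustration); after only about (π/4)·√N iterations, versus the ~N/2 guesses classical search needs on average, the marked item dominates:

```python
import math

# Classical simulation of Grover's quantum search on N = 8 items.
N, target = 8, 5
amps = [1 / math.sqrt(N)] * N              # start in a uniform superposition

# Grover's algorithm needs only ~ (pi/4) * sqrt(N) iterations (2 here).
for _ in range(round(math.pi / 4 * math.sqrt(N))):
    amps[target] = -amps[target]           # "oracle": flip the marked item's sign
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]    # inversion about the mean

print(round(amps[target] ** 2, 2))         # 0.95 -- vs. 1/8 for a blind guess
```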

But creating the hardware that can harness qubits presents a huge challenge. Qubits are encoded as the spins of individual particles like atoms, ions, or photons. These particles must be isolated so that they can’t interact with their surrounding environment, which would ruin the quantum computation. Bell Labs researchers, like several other groups, are pursuing a method for controlling qubits with a device called an ion trap.

Each trap is between a hundredth and a tenth of a millimeter long and has tiny electrodes that can hold an ion in place above it in an electric field, while a laser beam alters the ion’s spin. When the computation is complete, the ion is excited by a different laser, causing it to give off photons that can be recorded by a camera to reveal its final state, which represents part of the answer to a problem.

Research groups working with trapped ions have so far produced quantum computations using fewer than 10 qubits. To be of any practical use, though, a quantum computer will require hundreds or thousands of qubits. The qubits might be held in an array of many traps, known as a multiplex system, with connections for shuttling ions back and forth between different regions to prepare them for a computation, read their final states, and even store them in memory.

While most ion traps are currently made of ceramic, Bell Labs is working to design a multiplex system in silicon. Transistors could supply voltage from an external source wherever it’s needed, eventually allowing researchers to position thousands of ion traps on a single chip, says Richart Slusher, head of Bell Labs’ quantum computing team. Bell Labs expects to fabricate some of these multiplex traps in the next two years, says Slusher.

The Bell Labs group has “thought about the long-range problem, including how you do all the electronic controls,” says David Wineland, head of the Ion Storage group at the National Institute of Standards and Technology, a leading center of quantum computing research. According to Wineland, the ceramic traps that scientists have been using in current experiments have “obvious limits.” But what will ultimately replace them, he says, “is still open for question.”

Building ion traps on silicon would allow researchers to take advantage of the semiconductor industry’s decades of working knowledge. David Bishop, Bell Labs’ vice president for physical-sciences research, thus believes that all the basic technologies for quantum computing are ready–or that they soon will be. “We don’t see any fundamental showstoppers,” says Bishop.

Still, most researchers in the field, including Wineland and Slusher, do not expect a practical quantum computer to appear for at least another decade. Even then, the first machines will be built to solve very specific computing tasks. And while solving just, say, the factoring problem would have profound implications in cryptography, a quantum computer may not be any better than a conventional machine for many of the tasks that a desktop PC routinely handles.

None of this dissuades Bell Labs–which has eliminated much of its fundamental R&D in recent years–from pursuing what is, really, still a basic research project. Part of its motivation is the belief that the hardware research may pay off for Lucent long before quantum computers arrive, yielding advances in areas such as miniaturized lasers and optical components. “What we learn from working in the quantum computing field may someday lead to commercialization,” says Bishop, “but more importantly, it also drives discoveries that could improve today’s communications and computing technology.”
