Lean Mean R&D Machines

Leading companies want research units that can adapt to changing technologies and corporate business strategies.
December 1, 2001

The pharmaceutical industry has a drug problem: it can’t find enough new ones. Companies are under pressure to invent the next Prozac or Viagra, but without more efficient and cost-effective ways to develop new drugs, those kinds of blockbuster medications will remain once-in-a-decade discoveries. “It costs too much, it takes too long and it produces too little,” says Rod MacKenzie, a vice president in Pfizer’s Global Research and Development division, of the industry’s traditional method of drug discovery. “The problem is, how do you change the engine while the vehicle is still moving?”

One solution to the problem of retooling Pfizer’s $4.4 billion R&D engine is its Discovery Technology Center, a gleaming two-and-a-half-year-old facility in Cambridge, MA, where the company’s scientists team up with academic researchers and small biotech firms to develop computerized methods for screening thousands of potential drug molecules per day. Key to the lab’s strategy are its small size (just 70 researchers out of the 3,000 involved in drug discovery across Pfizer) and its location, at the center of the Boston-area biotech hothouse and a healthy distance from Pfizer’s main R&D facility in New London, CT. “We’re small and we’re offline in the sense that we don’t have the same day-to-day pressures of productivity that the other sites do,” says MacKenzie, who directs the center. “And what happens here in Cambridge is that people beat a path to our door,” including researchers from some of the area’s top academic institutions.

The pharmaceuticals industry is hardly the only sector where researchers are under increasing pressure to find new ways of zeroing in on high-growth products and technologies. MacKenzie’s critique of conventional research methods in the drug industry could be applied equally well to the chemical, aerospace, transportation, telecommunications and information technology sectors. And in each of these sectors, leading companies are looking for new ways to make their research groups nimble enough to react to ever-changing technologies and market opportunities.

For many, that means getting their researchers more connected: with each other, with their firms’ customers and, as at the Discovery Technology Center, with their peers in academia. It can also mean looking more closely than ever before at their portfolios of research projects, personnel and capabilities, and loading this information into so-called knowledge management databases that guide decisions about which potential products and technologies to pursue. These approaches and other new strategies are gaining urgency as growing economic uncertainty, coming after years of good times and loose wallets, reminds chief technology officers of the need to prove the value of their companies’ R&D spending.

Technology Review’s second annual Corporate Research and Development Scorecard shows respectable increases in R&D spending in the 2001 fiscal year at the majority of companies. But the scorecard does hint at stormy financial days ahead. Spending is flat or declining at some of the United States’ most notable technology firms, including Exxon Mobil and computer and telecommunications giants such as Compaq Computer, Silicon Graphics, Computer Associates, 3Com, Qualcomm and AT&T. And while most firms’ R&D spending has remained steady this year when measured as a percentage of revenues, forecasts for plummeting revenues as the economy heads into a recession are likely to translate into less money for research next year. “We have seen over the last several years a tremendous increase in the rate of growth of industry R&D spending, but you can’t sustain that rate of growth economically,” says Jules Duga of Battelle, a nonprofit research institute in Columbus, OH. What’s more, he says, “The cost of doing R&D keeps going up, so you have to spend more and more to gain less.”

What the scorecard data don’t show is the growing collection of industry R&D collaborations and new management approaches designed to counter just these challenges. Technology companies, in essence, are looking for ways to get a bigger bang for their R&D buck.

Making Connections

In the pharmaceuticals industry, where R&D costs have been rising for years without any commensurate rise in the number of new drugs reaching the market, it’s long been clear that old research models needed revision, says MacKenzie. Implementing a discovery process that produces more drugs in less time requires freeing researchers from the “pressures of productivity” that can keep them from experimenting with risky new technologies. At the same time, even a state-of-the-art research center needs to keep its work aligned with business needs. To that end, Pfizer’s Discovery Technology Center hires only researchers who “have scientific degrees but are also outstanding data miners or statisticians on the side,” says MacKenzie. Such people tend to be “totally immersed in the business of drug discovery, not off to one side of it, which is incredibly important to what we do.”

To guarantee a supply of such researchers, Pfizer cultivates close ties with the local academic community. Last year, for example, the company created a three-year fellowship program in computational biology at Cambridge’s Whitehead Institute for Biomedical Research. Fellows are expected to work inside the center for part of that time, examining gene sequences or protein structures relevant to drug discovery, but are also encouraged to do independent research. And for more great ideas, the center isn’t opposed to turning to smaller companies and technology suppliers. Cambridge, MA-based BioTrove, whose nanoscale liquid-handling technology allows researchers to mix tiny quantities of reagents with 10,000 or more separate drug targets on a single chip, is conveniently headquartered right inside the Pfizer center’s facility.

One of the basic tenets of the Pfizer center is that research is more effective, and more profitable, if it’s more connected to the world outside the company. And Pfizer is hardly the only high-tech company testing this hypothesis. Chip-making giant Intel, for example, is spending part of its $4 billion R&D budget this year to support a series of “lablets” adjacent to top universities, each directed by a faculty member who has taken a leave of absence for a year or more. Each of the 20- to 30-person lablets will focus on a promising young technology (see “Intel Revamps R&D,” TR October 2001). For example, computer scientist David Culler, the founding director of an Intel lablet at the University of California, Berkeley, is developing the software infrastructure for networks of tiny sensing and communication devices. If such devices eventually permeate our surroundings, gathering and wirelessly sharing information that could be used in surveillance, environmental control or scientific measurement, Intel wants to be the firm that builds them.

Key to this program, says Intel’s director of research David Tennenhouse, is the fact that the academic researchers heading the lablets have a strong desire to see their ideas applied in the real world. Intel, however, is discouraging the researchers from taking their technologies all the way to the commercialization stage or becoming business unit managers, which might keep them from doing what they do best: innovate. “We’re saying, ‘Work on this strategic research project for a few years, and if it succeeds keep moving downstream [toward the market] for a few years, but then cycle back to the lab until you have another idea that you want to foist on the world,’” says Tennenhouse.

Foot in the Door

If Intel and Pfizer are dismantling the old walls between corporate and university research, IBM is blurring another traditional boundary: that between corporate research labs and the company’s own customers. The idea behind Big Blue’s new Emerging Business Group is to offer small startup firms access to IBM’s extensive research in information technology. In return, IBM may get a small amount of cash or equity, but the main point is to encourage the smaller firms to build their own new technologies on top of IBM software and services. “Basically, we get IBM platforms into those companies so that when they succeed, we have a growth market for our products,” explains Dave McQueeney, IBM’s vice president of emerging business. “If we pick the startup companies that will grow up to dominate new spaces, it could be very smart for us.”

As an example, McQueeney points to work on online auctions at IBM’s Thomas J. Watson Research Center in Yorktown Heights, NY. A team there is developing “combinatorial optimization” software that compares bids in Internet-based auctions where the bidders seek to buy distinct but overlapping sets of items. One bidder, say, an electronics manufacturer, may need components A, B and C to build a CD player, while another may need B, C and D to build a VCR. The software solves the surprisingly complex problem of knowing who, from the seller’s point of view, has placed the most lucrative bid.
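
What the team is tackling is known in the research literature as the winner determination problem for combinatorial auctions, and it is computationally hard in general. The Python sketch below is only a toy, brute-force illustration of the idea, using invented bids and component names; it is not IBM’s software, whose actual methods the article doesn’t detail.

```python
from itertools import combinations

# Hypothetical bids for illustration only: each bidder wants an
# all-or-nothing bundle of components and names a price for the bundle.
bids = [
    ("cd_player_maker", {"A", "B", "C"}, 120),
    ("vcr_maker",       {"B", "C", "D"}, 150),
    ("speaker_maker",   {"A", "D"},       90),
    ("amp_maker",       {"E", "F"},       60),
]

def best_allocation(bids):
    """Pick the set of winning bids that maximizes the seller's revenue,
    never selling the same item to two bidders."""
    best_revenue, best_winners = 0, ()
    # Brute force over every subset of bids. Fine for a toy example, but
    # the general winner-determination problem is NP-hard, which is why
    # real systems need cleverer optimization techniques.
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            sold = set()
            feasible = True
            for _, bundle, _ in combo:
                if bundle & sold:  # item already promised to another bidder
                    feasible = False
                    break
                sold |= bundle
            if feasible:
                revenue = sum(price for _, _, price in combo)
                if revenue > best_revenue:
                    best_revenue, best_winners = revenue, combo
    return best_revenue, best_winners

revenue, winners = best_allocation(bids)
print(revenue, [name for name, _, _ in winners])
# Prints: 210 ['vcr_maker', 'amp_maker']
```

Even with four bids the best allocation isn’t obvious at a glance, and the number of subsets doubles with every added bid, which is why this brute-force approach only works as a sketch.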

IBM could commercialize the auction software on its own, but it might actually be wiser to hand the technology over to a startup. “Say you are an auction company,” explains McQueeney. “We say to you, here is this capability that we’ve had five PhD mathematicians working on for three years. We’ll make it available to you now, but it’s useless to you unless you run it on our infrastructure,” meaning IBM’s software and servers. Not only does the auction company get a big bang out of being first to market with the new software, but once other companies start adopting the technology, IBM gets to sell more infrastructure software. Just as important, IBM researchers are involved throughout the process, meaning they stay plugged into the latest trends and opportunities in their customer communities.

Cutting the Fat

The profusion of new, more fluid corporate research models can create a new problem: monitoring the results. If managers don’t have a good real-time picture of what their researchers are up to, it’s easy to wind up wasting money on redundant studies, underfunding promising technologies or letting moribund projects linger too long. But lately companies such as 3M have been getting serious about knowledge management, the use of searchable databases or intranets to archive what individuals and groups in an organization know and describe what they can do. Not only can that information foster creativity and new collaborations, it can also help chief technology officers decide whether to increase spending in areas of research viable in the current market or cut the fat where it’s unlikely the company could compete well.

3M managers companywide have strapped on their seat belts for a comprehensive spending review of R&D efforts instigated by new CEO W. James McNerney, who started last January. The first step was to build a database showing “where all of our [R&D] money was going in great detail, which was not something 3M had done before,” says Steve Webster, vice president for research and development in 3M’s corporate technology and transportation division. The database allowed “some very specific discussions about which opportunities are likely to have the biggest payoff, some that may not be so interesting and some that no one may be working on but which actually may be better opportunities than what other parts of the company are working on,” says Webster.

As one result, 3M has decided to put even more money into building new electronic displays based on organic light-emitting diodes, already identified as a potential successor to the company’s liquid-crystal display technology. The company also discovered it had underestimated its expertise in an important area: software design. In developing its knowledge database, the company surveyed each business unit about the R&D programs it depended upon. One of the technologies mentioned again and again in the survey belonged to a category called “Other.” “We had to open the box and ask, what is it about ‘Other’ that provides so much growth?” recounts Webster. “In this case it was software and electronics,” such as the technologies 3M sells to libraries to catalogue and track their books and deter theft. In fact, so many of 3M’s products now depend on computers and software that the company realized it needed to shore up research spending on information technology. “When you think of 3M you think of films and coatings, but our systems integration is actually very important,” Webster says. “Now we can quantify that,” and allot R&D resources accordingly.

Balancing an increased market focus, and closer ties to customers, with the pursuit of world-class science is now the trick for many corporate R&D groups. Gone forever are the days when large industrial labs churned out scientific papers and conducted long-term research far removed from business pressures. But while R&D groups have clearly gotten more business-friendly over the last decade, they are also feeling pressure to come up with tomorrow’s high-growth opportunities.

Indeed, industrial-R&D expert Richard Rosenbloom, an emeritus professor at Harvard Business School, is convinced that most high-tech firms, especially in the information technology sector, “still aren’t doing enough to invest in the future technologies that will be the next big revolutionary business. They’ve been experimenting with new venture units, spinoffs, joint ventures and the like, but I don’t know of a single big corporation that has a track record that is exceptional in any of those initiatives.”

But if innovative approaches to doing research, like those being implemented at Pfizer, Intel and 3M, do ultimately help their companies pull ahead of the pack, and do so cost-effectively, other corporate R&D teams may soon be looking to turbocharge their own technology engines.
