Clogged Pipeline

For the past half-dozen years, drug industry experts such as venture capitalist Jurgen Drews, the former head of research at Hoffmann-La Roche, have been arguing that pharmaceutical companies are running perilously short on new drugs. Drews calls it “the innovation gap”: by his calculation, each pharmaceutical company needs to bring at least one new drug to market every year, and preferably two or more, to survive and prosper. Instead, drug firms have been averaging considerably fewer: 0.4 to 0.8 new drugs per year per company.

The sequencing of the human genome should, in theory at least, solve one aspect of the innovation gap. After all, the entire pharmaceutical armamentarium (all the drugs against all the varieties of human illness) is aimed at a grand total of fewer than 500 biological targets, severely limiting the number of diseases that can be treated and the strategies used to treat them. The information contained in the human genome, say Drews and others, is likely to increase the pool of potential drug targets 10- or 20-fold.

But that bounty will come at a considerable cost. The process of transforming some biologist’s vision of a therapeutic gold mine into a new medication, certified by the U.S. Food and Drug Administration as safe and effective, is expensive and time-consuming, and it is getting more so every year. The latest estimate, from an analysis released in June by the Boston Consulting Group, suggests that pharmaceutical companies will spend about 15 years and $880 million for each novel drug that makes it to market.

First, a biological mechanism (a malfunctioning gene, for example, or the errant protein product of such a gene) has to be identified as critical to a disease process, and then that potential drug target has to be “validated”: proven to be truly relevant in the laboratory, whether in cells, in a test tube or in an animal model of the disease. Then drug candidates (perhaps small synthetic molecules or entire proteins) have to be created and screened to determine which of them can enter the body, reach the bloodstream and the relevant tissues, penetrate the cell in question, and alter the target’s function in some way that impedes the disease. The surviving molecule then has to be optimized for maximum efficacy with a minimum of side effects: it has to be tested for toxicity and perhaps reengineered; tested in live animals for safety; and finally, tested in humans, perhaps thousands of patients, first for safety and then for efficacy.

There are many tight spots along this pipeline, and biotech firms are employing a host of new tools to open them up. But the rate of attrition is still staggering. For every drug that makes it to market, 50 or 60 candidates will have failed. And that’s in the pre-genome world, where the great majority of drugs are aimed at variations on those 500 familiar targets and based on well-known biological themes. With thousands of new drug targets, thanks to the Human Genome Project and other genomics efforts, it’s a whole new ball game: one with extraordinary new promise and an entirely new set of risks.

The upside, according to a study by the investment bankers at Lehman Brothers and the management consultants at McKinsey, will be pharmaceutical advances so profound that they “are nearly impossible to imagine, let alone predict.” The downside, says the analysis, could include a fourfold increase in the rate of attrition in drug development (200 drug candidates falling by the wayside for every single drug that makes it to market) and an astronomical rise in research costs. The report’s stunning conclusion: in the short term, the flood of new drug targets could be fatal to pharmaceutical companies. Over the next five years, the report warns, “the industry could go bankrupt by trying to innovate.”

The problem is that the sequenced genomes provide too many potential targets, but not enough biological understanding to go with them, a situation often referred to as “drinking from the fire hydrant.” With so many targets, entirely too many would-be drugs could be rammed down the pipeline and make it to human trials, only to fail after enormous expense. Understanding the functions of genes in order to identify the most promising drug targets is proving to be one of the best ways to help unclog the drug development pipeline.

Enter a host of functional-genomics firms that share a simple strategy: learn as much biology about these potential targets as technologically possible, and do it as quickly as possible. Of the many technologies used to those ends, says geneticist David Altshuler of the Whitehead Institute for Biomedical Research in Cambridge, MA, the most promising are those, like Lexicon’s mice, that will allow researchers to directly manipulate the functions of all of an organism’s genes one by one-and thus pinpoint those that play the salient roles in the causation, progression or prevention of disease.
