Sunday, March 23, 2008

Ten Reasons Why Drug Discovery Is So Hard

1 New drugs are new. To develop a drug from a newly discovered molecule for a hitherto untreated condition is exploring terra incognita. Creating a new drug can mean developing a new biological concept, a new manufacturing process, and a new way of measuring a clinical response. What worked last time may be irrelevant for a new therapy.

2 Men are not mice. Drug discovery usually starts in the laboratory, progresses to animal models, and ends in human trials. But, as an example, the arthritis created by injecting mice with collagen is not really rheumatoid arthritis. Translating what scientists observe in mice to what will actually happen in humans is difficult - and often disappointing.

3 People vary. Laboratory experiments pay great attention to uniformity and control. But the testing of new drugs in humans confronts the defining characteristic of people – their heterogeneity. Beyond differences in sex, age, ethnicity, and drug metabolism, disease itself varies according to severity, preexisting conditions, and the influence of other medications. A drug effect observed in the laboratory is often overwhelmed by the variability of real, living humans.

4 Nature is conservative. The approval rate for new drugs that make it to clinical trials is around 20%, suggesting that something is working actively against new compounds. Human bodies expel foreign molecules, drugs included.

5 Manufacturing drugs is expensive. A new drug needs to be manufactured by new methods, often in a new plant that must pass inspection by regulatory authorities. The investment in such manufacturing facilities and processes must be made years before regulatory approval. Only the drugs most likely to succeed will ever justify this risky investment.

6 Ars longa, vita brevis. Most drugs take around ten years to develop. Attention flags over such a long period; markets change, companies run out of cash, and things go wrong.

7 Almost no one knows how to develop drugs. A project may take ten years or more; few people stay in one job for more than about three years. As a result, few researchers have seen a drug's development through from beginning to end. There is no academic discipline that prepares people for careers in drug development. Everyone learns as they go. And no one shares what they have learned outside the company.

8 There are no shortcuts. Many assume that opaque regulatory requirements are the main reason it takes so long and costs so much to develop a drug. Not so. Agencies explain clearly what they want to know about toxicology, efficacy, safety, and standards for new drugs. But it simply takes time to accumulate the requisite knowledge about a novel substance.

9 It doesn’t get any easier. As science develops better, more-effective therapeutics, the bar gets higher for the next generation of drugs. When one drug becomes a standard, its successor will likely take longer to develop and cost more.

10 There are no revolutions. Now, as ever, progress in drug discovery is incremental. Understanding the molecular basis of disease will allow the development of specific and more-predictable treatments. Better information systems will help to order the immense amount of data now generated in the laboratory. But neither genomics nor bioinformatics will magically transform drug discovery.

Contributed by Barry Sherman and Philip Ross. Originally published by them in The Acumen Journal of Sciences, Volume I and reprinted with their permission.

Thursday, March 20, 2008

The Failure of Industrialized Research (Part 3)

Drug discovery is unpredictable and unmanageable. So why do large pharmaceutical companies spend so much money on it?

The glimmerings of the approach were visible in the development of captopril. Squibb’s chemists made systematic alterations in organic molecules that were then tested for activity against a targeted enzyme. A new route was added in the 80’s, when the nascent techniques of recombinant DNA allowed a protein of interest to be traced to its gene, the gene inserted into a cell, the cell bred in the billions, and the protein extracted. Among the first proteins so manufactured was human insulin, a predictable success given that the role of insulin in treating diabetes was already understood.

The drug industry expected its biologists to discover, in short order, a host of protein therapeutics, ushering in an era of discovery based on biology, rather than chemistry. Indeed, several blockbuster protein drugs were produced, almost all conforming to the model of human insulin: scientists knew ahead of time that the drugs would work, if only the protein could be made economically. But for each success, there were a great number of costly failures.

The 90’s saw an immense increase in spending on industrialized research in chemical approaches. Generally, new technologies employed robotic systems to process large numbers of drug candidates in a blind search for a desired biological activity. Combinatorial chemistry starts with a molecule of interest, then adds to it randomly to form a burgeoning family of molecules. The technique formed the basis of biotech startups, like Darwin Molecular (now Chiroscience R&D), AXYS Pharmaceuticals (since acquired by the Celera Genomics Group), and Affymax. To manage the plethora of resulting molecules, other firms used technologies pioneered in Silicon Valley to build expertise in high-throughput screening.
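The combinatorial idea described above is, at bottom, simple enumeration. A toy sketch (the scaffold and substituent names here are entirely hypothetical) shows how quickly a library grows from a single molecule of interest:

```python
from itertools import product

# Toy sketch of a combinatorial library: a core scaffold with two attachment
# points, each filled from a list of substituents. All names are hypothetical.
r1_options = ["H", "CH3", "OH", "Cl"]
r2_options = ["NH2", "COOH", "F"]
library = [f"core({r1},{r2})" for r1, r2 in product(r1_options, r2_options)]
print(len(library))  # 4 x 3 = 12 candidates from a single scaffold
```

With even a handful of attachment points and a few dozen substituents apiece, the same loop yields millions of candidates, which is why high-throughput screening became necessary to sort through them.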

Also in the 90’s, robotics was combined with the polymerase chain reaction, an established technique that quickly and reliably copies selected sections of DNA, to create machines capable of mass-sequencing genes. This effort produced a wealth of raw data that promises to yield new drug targets for both biological and chemical approaches. A number of companies besides Celera, including Millennium Pharmaceuticals and Human Genome Sciences, are mining data for new targets, hoping either to develop drugs themselves or sell the information to pharmaceutical firms.

More advanced drug discovery tools appear every year. Proteomics companies like CuraGen, Myriad Genetics, and ProteoMetrics were founded to categorize the blizzard of proteins synthesized in human cells, a far greater challenge than transcribing the human genome. Functional genomics, another buzz phrase, has been seized upon as “the next big thing” by companies like Affymetrix, Celera, and Human Genome Sciences, all of which are trying to link raw genetic data to precise physiological functions in the cell. To accelerate, and thus cheapen, the multifarious chemical analyses, other startups hope to integrate them all into “labs on a chip,” in which tiny, etched channels pipe droplets of reagents from one reaction chamber to another. Still more high-tech is “virtual screening,” the plan to replace actual laboratories – even those reduced to a chip – with computer simulations.

It is impossible to measure the success of these technologies against the sums spent on them. Private investment in genomics alone came to $1 billion in 1996 and $2 billion in 1997, rising in multiples every year thereafter until the bubble burst in 2000. It is not unreasonable to conclude that for genomics alone, some $15 billion was raised. Meanwhile, proponents of industrialized research cannot yet point to astounding drug discoveries their ideas have engendered.
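The ~$15 billion estimate is consistent with the doubling pace of the first two years; a quick sketch (the year-by-year figures after 1997 are my assumption, extrapolated from the cited 1996 and 1997 numbers):

```python
# One doubling schedule consistent with the figures above: $1B (1996),
# $2B (1997), and -- by assumption -- $4B (1998) and $8B (1999).
raised_by_year = {1996 + k: 2 ** k for k in range(4)}  # in billions of dollars
total = sum(raised_by_year.values())
print(total)  # 15 (billions), matching the ~$15 billion estimate
```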

The failure of industrialized research to produce blockbuster results for pharmaceutical companies suggests several conclusions. The billions of dollars spent in pursuit of innovator molecules have (largely) been wasted, or will be rewarded over a much longer time frame than desired. Drug discovery is neither predictable nor inherently manageable. Serendipity remains the most obvious explanation for the majority of new discoveries.

At university laboratories, serendipity is understood, creativity is valued, and researchers are not subject to corporate management. Moreover, these labs are more numerous than industrial labs and remain the most productive source of genuinely new ideas. Small, single-minded biotechnology firms are best suited to the early development of NMEs and biologics. As these firms become larger and more successful, they become turgid, less able to develop new ideas. And pharmaceutical companies are the organizations that can most effectively validate new research, shepherd novel drugs through the later stages of development, manage their regulation, and commercialize and market new therapeutics. To that end – and in the hope of a lucky break in discovery – it is reasonable for them to invest in large staffs of researchers.

But the scale and unrestrained growth of the pharmaceutical industry’s investment in research is irrational. Such investment will not produce the wealth of new drugs for which the industry longs. The unclogging of the drug pipeline will come from intelligent investment in biotech firms and academia. To prescribe how much money large pharmaceutical companies should spend on R&D would be a speculative exercise at best. However, if the real function of pharmaceutical R&D is not discovery, but rather the validation, development, and reformulation of existing molecules, then much of what is currently spent on discovery would be better spent on licensing, venture investment, and acquisitions – as well as in development itself.

Pharmaceutical companies will always have discovery research labs. They will never exactly resemble high-technology businesses (where networking companies like Cisco Systems effectively outsource all of their science to startups and universities), but they will look more like high-tech firms than the ossified institutions they are now.

Part 3 of 3.

Contributed by Barry Sherman and Philip Ross. Originally published by them in The Acumen Journal of Sciences, Volume I and reprinted with their permission.

Sunday, March 16, 2008

The Failure of Industrialized Research (Part 2)

Drug discovery is unpredictable and unmanageable. So why do large pharmaceutical companies spend so much money on it?

Screening was begun early in the 20th century by such pioneers as the German microbiologist Paul Ehrlich. Having learned that a compound of arsenic killed the microbe responsible for sleeping sickness, Ehrlich sought a related compound that would kill the microbe without also poisoning the patient. He tried some 900 compounds in mice, to little avail. Then his colleague Sahachiro Hata tested one on the microbe that causes syphilis. This compound killed the microbe, but not the experimental animals. A year later, in 1910, Ehrlich released salvarsan, the first synthesized drug to cure a disease.

How the drug worked played no role in its discovery. Why a compound killed germs without harming patients overmuch; what the microbes for two quite different diseases had in common – these puzzles went unexplained. The breakthrough – the idea that arsenic could kill germs – came from inspired observation; applying that discovery entailed laborious screening. No one dreamed that inspiration could be made routine or that works of genius could be produced on a corporate schedule.

Around the time salvarsan hit the market, European scientists were inferring a relationship between diabetes and the pancreas. After a decade of fruitless efforts, the Canadian physician Frederick Banting had the critical insight: if a pancreas is ground up, its digestive secretions destroy its diabetes-moderating secretions. Banting and his colleague Charles Best found a way to isolate the latter and successfully treated diabetes in dogs. Soon afterward, in 1922, they gave the world insulin – the first treatment for diabetes. No step in this process would have lent itself to industrialization; if we had to discover insulin and other hormones all over again, we might very well have followed similar paths.

Beginning in the 70’s, as knowledge of the molecular basis of disease increased, the pharmaceutical industry began to hope for steady discoveries based on a different model. Instead of searching for a compound that would simply, say, lower blood pressure in model organisms, pharmacologists could target specific enzymes, like the angiotensin-converting enzyme (ACE). In fact, the first ACE inhibitor was discovered as the result of just such a targeted search, by scientists at Squibb in 1977.

But even the roots of this discovery lie in serendipity. A doctor noticed that deaths from snakebite in the banana plantations of Brazil involved catastrophic declines in blood pressure. Researchers in London isolated the causative peptides, and the search for the drug was taken up by two chemists at Squibb, David Cushman and Miguel Ondetti, who synthesized one and showed that injections of it lowered blood pressure. Squibb saw no market for an injectable antihypertensive and shelved the project. Then, in 1974, research on a related enzyme system – again, by academics – showed how molecules small enough to be effective if taken orally might do the job, and the Squibb team developed captopril. It gained FDA approval in 1981, starting an avalanche of development at other companies that has led to many more ACE inhibitors.

The saga of captopril shows the pharmaceutical industry at its very best, developing a drug that was at once a success for science, for patients, and for a company’s bottom line. Yet it hardly promised the industry what it yearned for: a regular supply of bankable discoveries. Since then, the bar has risen: demand for new ideas has increased as the number of important diseases lacking a treatment has dwindled.

The traditional, itinerant investigator, as in the Ehrlich example; the hypothesis-driven project, as in the insulin story; or applied research, as in the case of ACE inhibitors – all relied on luck and brilliance to find new drugs. In its search for an alternative to these methods, the industry fell in love with a new approach to research that promised to minimize serendipity and maximize predictability: the industrialization of scientific research.

Part 2 of 3. Part 3 to follow shortly.

Contributed by Barry Sherman and Philip Ross. Originally published by them in The Acumen Journal of Sciences, Volume I and reprinted with their permission.

Thursday, March 13, 2008

The Failure of Industrialized Research (Part 1)

Drug discovery is unpredictable and unmanageable. So why do large pharmaceutical companies spend so much money on it?

Observing the pharmaceutical industry from a prudent distance, one might wonder at its apparent confidence in its ability to discover new drugs. Pharmaceutical companies maintain great staffs of scientists and spend ever-increasing sums of money on technologies meant to improve the chances of discovering new medicines. Their research-and-development budgets surpass that of the National Institutes of Health. And this investment is only growing: global expenditures on R&D have doubled over the past decade, from $22.2 billion in 1991 to $44.5 billion in 2001. [1] If the investment by pharmaceutical companies continues to grow at this pace, R&D spending could reach $57 billion in 2006.
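The doubling from $22.2 billion to $44.5 billion over ten years implies a compound annual growth rate that can be checked directly; a minimal sketch:

```python
# Implied compound annual growth rate (CAGR) of global pharma R&D spending,
# from the figures cited above: $22.2 billion (1991) to $44.5 billion (2001).
start, end, years = 22.2, 44.5, 10
cagr = (end / start) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # roughly 7.2% per year
```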

But closer inspection reveals a contradiction: in most cases, these large corporations purchase seminal drug ideas from smaller, entrepreneurial biotechnology companies and university research laboratories. In 2001, the 14 major pharmaceutical companies were responsible for discovering merely 26% of the 32 biologics and new molecular entities (NMEs, defined by the Food and Drug Administration as active therapeutic ingredients that have never been marketed in the United States), while being responsible for 65% of global R&D spending. Biotech and academic labs produced the remainder. [2] Furthermore, the productivity of pharmaceutical companies is decreasing: the number of medicines in the early stages of development (preclinical and clinical Phase I and II) dropped by more than 20% from 1999 to 2001. [3]

Finally, consider this: most of the NMEs that big pharmaceutical companies do develop are not focused on new targets; they do not create entirely new medicines or open up new markets. Of the 32 FDA-approved medicines that came to market in 2001, only 5 were novel therapeutics or were directed against new targets. [4] None have their roots in large pharmaceutical companies (see “FDA Novel Therapeutics or New Targets, 2001,” page 49).

A critical assessment of these products reveals that many are me-too drugs; that is, they are structurally similar to, and affect the same targets as, drugs already on the market. What pharmaceutical companies call NMEs very often aren’t new at all: of the 13 new drug applications (NDAs) approved thus far by the FDA in 2003, 8 are reformulations or combinations of drugs that have already been approved. [5] Xanax, in a new extended-release formula, makes the list. Another, Cardizem, was first approved in 1999. [6]

“A company with a research budget of $4 billion should be turning out five new drugs a year, and they’re not,” says Charles Grudzinskas, an industry consultant and professor of pharmacology at Georgetown University Medical Center.

What, then, do those armies of corporate scientists do? Do pharmaceutical companies accumulate them as tokens of prestige? Or is this merely a fossilized behavior that once made economic sense?

For large pharmaceutical companies, the issue is, predictably, a sensitive one. Most of the senior executives we contacted for this essay declined to be quoted on the record. But one put it this way: “Oh, we do it for a variety of reasons. Because revenues permit it; research is only a fraction of our total budget. Because it only takes one success to win big; it’s a venture mentality. Because consolidation is inevitable, and bigger looks better. And finally, because management has a vested interest in wasting money. Our stock price is supported by the appearance of large-scale research and a full drug pipeline.”

The executive was, perhaps, being excessively cynical. His reasons, while partly true, are subsidiary: the best explanation of why pharmaceutical companies invest in early-stage discovery is that they need the expertise such investment sustains (even if, as will be discussed later in this essay, the scale of their investment is irrational). According to UBS Warburg, only 20% to 25% of the R&D spending by pharmaceutical companies goes to drug discovery itself; the great majority is spent on development. Research is thus a sunk cost of being in the drug business: since the best, most innovative ideas are purchased from others, big companies need scientists to validate outside discoveries and to develop those discoveries once they have been purchased. Industry-based scientists analyze drug candidates for toxicity and other risks in cell cultures, animals, and people. Finally, they optimize molecules and create variations on known themes. In other words, pharmaceutical companies need experts both to validate potential acquisitions and to develop drugs; therefore, they must invest in both.

Increasingly, the secret ecology of the big pharmaceutical industry is to outsource drug discovery. In 2000, 55% of big pharmaceutical companies’ approved products were “in-licensed,” in the jargon of the trade. In 2002, this fell to 36%, and increased mergers and acquisitions suggest a growing reliance on external discoveries: deals and alliances for preclinical and discovery products increased by 85% between 2000 and 2002.

Consider the case of Bristol-Myers Squibb and ImClone. After spending $14 billion in current dollars on research over the course of a decade, Bristol-Myers Squibb felt it necessary to spend another $2 billion to acquire the rights to Erbitux, the anticancer drug candidate from ImClone (see “ImClone Systems,” page 26). Amgen, whose first two blockbuster drugs trace back to before the company’s founding, has spent $1 billion a year – 25% of its sales – on research, yet it, too, has gone to outside sources, notably spending $9.6 billion to acquire Immunex and, with it, the arthritis drug Enbrel. Even Human Genome Sciences, which has defined itself by its powers in drug discovery, has resorted to outsourcing, paying $120 million to acquire Principia, mainly for its long-acting growth hormone technology.

How did this happen? By now, the industry had expected to be feasting on the fruits of a revolution in fundamental research technology. Genomics and proteomics were to have produced vast quantities of molecular targets in the body (and in the organisms that infect it). Combinatorial chemistry was to have made a blizzard of arrows, some of which would surely hit those targets. High-throughput screening was to have shot those arrows at myriad targets to see which ones stuck. The resulting plethora of new ideas was forecast to be almost more than the development side of the industry would be able to exploit. But for all their sophistication and promise, these tools have as yet contributed few new product ideas.

Traditionally, companies have made substantial investments in research by their own scientists into the types of drugs they already market. Several other approaches to drug discovery have been tried. Some companies have even freed their scientists to do pure research, without corporate interference, at entities like the Merck Genome Research Institute, the Bristol-Myers Squibb Pharmaceutical Research Institute, the Pfizer Technology Discovery Center, and DNAX, which was purchased by Schering-Plough to carry out basic research in immunology. Some companies have outsourced their research by sponsoring academic research in return for first dibs on the commercial rights to any discoveries. As far back as 1980, Germany’s Hoechst entered into such an agreement with Massachusetts General Hospital (but got little out of it).

None of these strategies, individually or in combination, has provided the predictable flow of promising new therapeutics that the industry requires to maintain its profitability. Disenchantment with the unpredictability and unmanageability of a discovery model driven by investigators’ curiosity and instinct has propelled the movement toward an industrialized model of research with large-scale screening as its foundation.

Part 1 of 3. Part 2 to follow shortly.

Contributed by Barry Sherman and Philip Ross. Originally published by them in The Acumen Journal of Sciences, Volume I and reprinted with their permission.

1 Centre for Medicines Research International Pharmaceutical R&D Expenditure and Sales 2001: Pharmaceutical Investment and Output Survey 2001: Data Report I.
2 Ibid.
3 Ibid.
4 Zambrowicz, B., A.T. Sands (2003) Knockouts model the 100 best-selling drugs – will they model the next 100? Nature Reviews: Drug Discovery 2(1) 38-51.
5 FDA (2003) New Drug Approvals for Calendar Year 2003.
6 FDA/Center for Drug Evaluation and Research. Reports to the Nation 1993 – 2001 and Drug Topic Archives New Drug Approvals 1995 – 2001.
7 Burrill & Company (2003) Biotech 2003: The 17th Annual Report on the Industry.

Saturday, March 1, 2008



If the pharmaceutical industry's lifeblood is demand generation, prescription compliance, and a robust pipeline, two of those three are being pumped up by a novel program in New York City. As reported in The New York Times on February 26, 2008, page B3, Mayor Michael Bloomberg announced that, after a two-year development effort and more than $60 million of public funding, the city was ready to equip any or all doctors with computer software (from eClinicalWorks) that can track patients' records in order to provide better patient care. They project 1,000,000 patients on this system by year's end!

I can only imagine what a bonanza this could be for the pharmaceutical industry! Assuming the system succeeds – and I believe it is likely to (this is my optimism at work, as publicly funded systems projects are rarely viable on initial rollout, and the complexity of the various stakeholders only exacerbates the problem) – the potential benefits for patient and supplier are enormous. Tightly integrated compliance-monitoring capability could easily be added, making it simple to inform and remind patients about their scripts; online adjudication could be another step away, and electronic script writing could be fused in to make for a far more effective and efficient health care delivery system.

The value to the industry suggests to me that business, commercial, and market development organizations should start studying, and promptly backing, such a system on a nationwide basis. Mayor Bloomberg suggests that “This can do for health what the Bloomberg terminal did for finance” (and, by the way, the terminal made the mayor a multi-billionaire). If he is even partially correct, the payback for all is monumental.

What do you think?


Contributed by: Lawrence J. Rothman, PhD