September 26th, 2011
Posted by Shane Climie, Ph.D.
In his latest Wall Street Journal column (“Drugs That Are as Smart as Our Diseases”), biologist/author Matt Ridley bemoans the plummeting efficiency of drug discovery in the pharmaceutical industry. He points to a disturbing paradox: while identifying and sequencing the genes of pathogens and cancer cells has become much cheaper in a short period of time, the number of new drug candidates (based at least in part on our knowledge of those genes) has dropped. According to Ridley, the number of new molecule approvals per billion dollars of inflation-adjusted R&D spending is now no more than one percent of what it was in 1950. And as we’re all aware, this decline in innovation is all the more dire because the pharmaceutical industry needs to replace so-called “blockbuster” drugs that are about to lose patent protection if it is to keep investors satisfied and fuel future innovation.
So, why hasn’t the same industry that gave us statins, Herceptin®, and vaccines come up with a new generation of treatments? The biggest problem might lie in its own success. Researchers today confront an enormous—and growing—amount of genetic and biochemical information as the search continues for newer, more effective drugs. As we generate more data, we are deepening our understanding of the complexity of the biological processes underlying disease states. While this better understanding can lead to innovation, it has also uncovered obstacles. For example, scientists have found that the signaling pathways leading to cancers are replete with redundancy, shortcuts, and other molecular detours that blunt the activity of cancer drugs. Sometimes these pathways can help eliminate or prevent cancer; at other times they can exacerbate it.