Updated on 8 February 2013
Mr Leigh Berryman is a professional, accredited toxicologist with over 33 years of experience in the biopharmaceutical industry. He has held scientific and management positions in Europe and North America, and has been based in Singapore for the last few years. He joined Maccine in Singapore as CEO in 2008. Maccine is a member of BioSingapore, the industry association of Singapore.
Recent promising data on drug approvals in 2011 shows that the FDA approved 35 new molecular entities (NMEs). This is the largest haul of any year in the last decade, with the exception of 2009, when a record 37 new drugs passed the FDA's muster. This should be reason to celebrate, and certainly the FDA are patting themselves on the back for hitting their timeline targets under the Prescription Drug User Fee Act for almost all of these NMEs (what the regulators do not mention is that this is a feat rarely achieved since the act's introduction in 1992, but that's another story). However, it would be a rare commentator who would describe the mood of the biomedical science industry as "optimistic". In fact, data from PwC and the National Venture Capital Association shows that funding and the number of deals concluded in the first quarter of 2012 dropped to the lowest level since 2003. In contrast, judged by reported R&D spend, research activity is increasing and is projected to increase further over the next few years. These conflicting indicators reflect a new approach by the industry, possibly one resulting in a shift in spend within the process, from development into discovery.
Historically, once a lead was identified, moving a drug candidate through to the clinical stage as quickly and cheaply as possible was considered the highest priority. Internal efforts at pharmaceutical companies focused on making the post-discovery process of drug development more efficient. Equally, within VC-funded biotech companies, the most important question a CEO had to ask regarding expenditure was "does this move me closer to the clinic?". However, if all that is achieved by streamlining the process is an efficient rush to fail in phase II, by far the most common point of development withdrawal, then this is clearly a waste of resources. Effectiveness of product is more important than efficiency of process, and this has been discussed in the industry for many years now. Despite this, the rate of progression to phase III has been gradually getting worse. A total of 34 percent of candidates passed through phase II in the period 2003-07. Only a quarter made it through from 2005 to 2009, and recent analysis by Medical Marketing & Media, a business publication, suggests a miserable 22 percent passed this stage from 2007 to 2011.
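The waste implied by late-stage attrition can be made concrete with a back-of-envelope model. The sketch below is purely illustrative: the stage costs and the preclinical/phase I pass rates are hypothetical placeholders, not figures from this article; only the phase II pass rates (34, 25 and 22 percent) come from the numbers quoted above.

```python
# Illustrative attrition-cost model. Stage costs and early-stage pass
# rates are hypothetical; only the phase II pass rates are from the text.

def cost_per_phase_iii_entrant(stage_costs, pass_rates):
    """Expected spend, across all candidates started, per candidate
    that survives every listed stage and enters phase III."""
    expected_spend = 0.0
    survival = 1.0
    for cost, p in zip(stage_costs, pass_rates):
        expected_spend += survival * cost  # surviving candidates incur this stage's cost
        survival *= p                      # only a fraction proceed to the next stage
    return expected_spend / survival

# Hypothetical costs ($m) for preclinical, phase I and phase II,
# paired with the reported phase II pass rates for three periods.
stages = [5.0, 15.0, 50.0]           # assumed, for illustration only
for period, p2 in [("2003-07", 0.34), ("2005-09", 0.25), ("2007-11", 0.22)]:
    rates = [0.6, 0.6, p2]           # preclinical/phase I rates assumed
    cost = cost_per_phase_iii_entrant(stages, rates)
    print(f"{period}: ${cost:.0f}m spent per phase III entrant")
```

Whatever placeholder numbers are used, the shape of the result is the same: as the phase II pass rate falls, the total spend needed to deliver one phase III candidate rises sharply, which is the sense in which an "efficient rush to fail" wastes resources.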
Companies seem to be changing tack and there is now a focus on "quality" of the pipeline. It would be good to report that companies are focusing on areas of high unmet clinical need (or on treatments that have a major advantage over existing therapies), but the definition of "quality" in this context is a drug that has a good chance of passing through the later stages of clinical development, i.e. demonstrating proof-of-concept in patient trials. Of course, as the science journal Nature suggests, withdrawal during phase II may not be due to efficacy alone; there are also pharmacoeconomic issues in play. But it would be expected that the trial sponsor had done the post-marketing maths and checked the competition before spending millions on a patient trial. In fact, recent (and fairly blunt) analysis of total R&D spending per drug approved since 1997 provides an interesting take on the cost of innovation. For example, Novartis spent an average of just under $4 billion for each of their 21 NMEs, and their Basel-based neighbours, Roche, spent an average of almost $8 billion for each of 11 NMEs, according to figures provided by Forbes. In many ways, these numbers have little true meaning, but they do highlight the variable costs of development, and these costs are of course heavily weighted towards the later clinical stages.
It is generally accepted that it makes more sense to focus on enhancing the success rate of a smaller number of candidates by spending more in the earlier, less expensive stages of development. As well as the strategy of "fail fast, fail cheap", this means properly interrogating mechanisms in relevant preclinical models. It also means attempting to replicate clinical trial conditions, such as the heterogeneity of populations, and testing clinical endpoint technology in animal assays. This is the translational challenge of research, and to combat it, most pharma and many biotech research groups now have both clinicians and translational scientists working alongside their discovery scientists as part of a project team. Specialist external research is also being accessed, and CROs with specialised expertise in this area, like Maccine in Singapore, are working to provide more relevant in vivo models through the use of naturalistic models and the implementation of clinical technology, in the hope that this will better represent the science and methodology of a phase II study (thereby pre-empting its costly failure).
One of the first of the big pharma companies to radically shake up their R&D efforts was GSK, a company that historically has been very efficient at throwing large numbers of drugs over the wall into its early clinical pipeline. GSK had 127 drugs in clinical development according to its February pipeline update, a significant drop from 2007, when there were reportedly 158 drugs in the clinic (according to R&D pipeline intelligence provider Pharmaprojects). At that time GSK responded to the productivity crisis, as many other major pharma have since, by restructuring its research efforts. GSK formed a number of competitive, semi-autonomous discovery performance units (DPUs), each designed to focus on a domain of therapeutic interest. It has been a closely watched strategy, and a recent update this year acknowledged that three DPUs have been closed, but another four were created, and successful ones seem to have been rewarded with 20 percent budget increases in the last round of internal "financing". It will be interesting to see how this plays out. The update indicated that discovery costs have dropped since the restructure, while the phase III pipeline has doubled. So, perhaps the revised strategy is working.
In addition to realigning scientific operations and ensuring regulators respect the allocated review time, there are a number of other factors that, in theory, should have been enabling a more efficient development process over time. These include a wide range of high-throughput screening technologies, the digitisation of research management, and niche CROs driving efficiency through specialisation or repetition. We would expect a marked reduction in drug discovery and development time over the years, and this has indeed been happening. According to studies and projections by Keyhani et al in 2006 (in an article published in the journal Health Affairs), a drug entering development in 1985 would have taken about eight years in the clinical development and approval stages; by 2001, this had been reduced to just four years after filing an investigational new drug application. However, this is for drugs that actually go the distance, and phase II remains the most substantial barrier. To give a topical Olympic analogy, it seems that industry efforts may now be correctly focused on perfecting the run-up towards this hurdle rather than the speed at which it is reached.