Big Data Will Revolutionize Clinical Trials

Clinical trials as we know them are over. Their annual cost is unsustainable, and solutions that curtail these expenses must be put in place. Luckily, big data offers unprecedented opportunities to revolutionize research.



Clinical trials cost US$25 billion in 2006 alone, and pharma companies spend anywhere between US$100 million and US$800 million per drug candidate, with costs growing at 4.6% per year (US data). These are staggering figures, especially considering that 15-20% of trials never manage to enroll a single patient, and more than two thirds of trial sites fail to meet their original enrollment goals for a given study. The resulting delays not only inflate costs but often leave research incomplete or abandoned altogether. That, in turn, means that progress stagnates and the chances of revolutionary discoveries ever reaching the market are slim. The industry can’t go on like this.

“We have to be wiser and implement new ways of running trials, otherwise there will be no breakthroughs,” said Oriol Solà-Morales, Director of the Pere Virgili Health Research Institute (IISPV), about the current situation.

Easier said than done. The Food and Drug Administration (FDA) and its equivalents across the globe require rigorous phase I through III trials as part of the approval process, and it is difficult to imagine any of these agencies abandoning its current demands.

“I like to say that in the future there will be no trials, there will be IT, which will completely substitute trials.”

Nevertheless, some visionaries propose replacing today’s crude, high-variance designs with expanded use of genomic, biomarker, and clinical enrichment designs that produce better-informed population selection. In addition, experts propose that sophisticated modeling combined with smarter phase II programs could reduce false-negative trials. Others go even further, suggesting that in the future trials might be abandoned altogether.
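To make the enrichment idea concrete, here is a minimal simulation sketch in Python. All of its numbers (biomarker prevalence, effect size, sample sizes) are hypothetical assumptions chosen for illustration, not figures from the article; the point is simply that restricting enrollment to the subgroup in which the drug can plausibly work makes a real effect far easier to detect.

```python
# Illustrative sketch: how a biomarker-enrichment design can reduce false-negative
# trials. Every number here (prevalence, effect size, sample size) is a hypothetical
# assumption for demonstration, not a figure from the article.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)

N_PER_ARM = 200           # patients per arm
PREVALENCE = 0.3          # share of patients who are biomarker-positive
EFFECT_IN_POSITIVE = 0.5  # treatment effect (in SD units), present only in positives
N_SIMULATIONS = 2000

def trial_detects_effect(enriched: bool) -> bool:
    """Simulate one trial and return True if it reaches p < 0.05."""
    if enriched:
        # Enrichment design: enroll biomarker-positive patients only.
        positive = np.ones(N_PER_ARM, dtype=bool)
    else:
        # All-comers design: only a fraction of enrolled patients can respond.
        positive = rng.random(N_PER_ARM) < PREVALENCE
    control = rng.normal(0.0, 1.0, N_PER_ARM)
    treated = rng.normal(0.0, 1.0, N_PER_ARM) + EFFECT_IN_POSITIVE * positive
    return ttest_ind(treated, control).pvalue < 0.05

for enriched in (False, True):
    power = np.mean([trial_detects_effect(enriched) for _ in range(N_SIMULATIONS)])
    label = "enriched" if enriched else "all-comers"
    print(f"{label:>10} design: estimated power = {power:.2f}")
```

Under these assumptions, the all-comers design detects the effect only about a third of the time, while the enriched design almost always does; that gap is precisely the kind of false negative the proponents want to design away.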

“I like to say that in the future there will be no trials, there will be IT, which will completely substitute trials,” Solà-Morales asserted. A bold statement, given that thus far IT in healthcare has been treated as a companion tool: data is collected and stored, but never analyzed or used. “It’s like using MS Office as a typewriter, instead of taking advantage of the features embedded in the software,” Solà-Morales commented.

The future is data

McKinsey Global Institute estimates that applying big data strategies to better inform decision-making could generate up to US$100 billion in value annually across the US healthcare system. Predictive modeling of biological processes and drugs, for example, offers an opportunity to adapt trials appropriately when safety signals are detected in small but identifiable subpopulations of patients.

The potential of big data is enormous. Dynamic sample estimation, adapting to differences in patient recruitment across sites, and remote monitoring of participants are just a few of the many opportunities offered by real-world data (RWD).
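To give one of those opportunities some shape, the sketch below shows a bare-bones version of dynamic sample estimation: the per-arm sample size is recomputed at an interim look once the observed variability turns out to be larger than the planning assumption. The planning values, target power, and simulated interim data are entirely hypothetical.

```python
# Illustrative sketch of dynamic (interim) sample-size re-estimation.
# The planning values, target power, and simulated interim data are all hypothetical.
import numpy as np
from scipy.stats import norm

ALPHA = 0.05              # two-sided significance level
POWER = 0.80              # target power
PLANNED_SD = 1.0          # standard deviation assumed at the design stage
TARGET_DIFFERENCE = 0.4   # clinically relevant difference between arms

def n_per_arm(sd: float) -> int:
    """Standard two-sample sample size for detecting a difference in means."""
    z = norm.ppf(1 - ALPHA / 2) + norm.ppf(POWER)
    return int(np.ceil(2 * (z * sd / TARGET_DIFFERENCE) ** 2))

print("Planned n per arm:", n_per_arm(PLANNED_SD))

# Interim look: pooled, blinded data turn out to be noisier than planned.
rng = np.random.default_rng(7)
interim_measurements = rng.normal(0.0, 1.3, size=150)
observed_sd = interim_measurements.std(ddof=1)

print(f"Observed interim SD: {observed_sd:.2f}")
print("Re-estimated n per arm:", n_per_arm(observed_sd))
```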

Promising, but before that vision can turn into reality, the industry needs to establish standards of data analysis. Right now healthcare is stuck in a traditional epidemiological framework that is highly deterministic: it analyzes isolated snapshots instead of exploring the relationships between data points.

“Say you go to a physician with a fever, and the doctor asks you when it started, what your average is, what your minimum and maximum are, and so on. He uses a point estimate, which is important, but he rules out alternatives by looking at the other relevant data: how you got there and the context of the disease are just as important,” Solà-Morales explained. “You need to analyze the relationships between data, not the data itself.”
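A toy version of that argument, with readings invented purely for illustration: two patients present with exactly the same temperature right now, yet the trajectories that produced that number point in opposite directions.

```python
# Toy illustration of the quote above: two patients share the same "point estimate"
# (the latest temperature reading), but the trajectories behind it differ.
# The readings are invented for the example.
import numpy as np

hours = np.arange(0, 12, 2)  # hours since symptoms started
patients = {
    "patient A (fever rising)":  np.array([37.0, 37.3, 37.6, 37.9, 38.2, 38.4]),
    "patient B (fever falling)": np.array([39.6, 39.3, 39.0, 38.8, 38.6, 38.4]),
}

for label, temps in patients.items():
    latest = temps[-1]                      # the isolated snapshot
    slope = np.polyfit(hours, temps, 1)[0]  # the relationship across readings
    print(f"{label}: latest reading {latest:.1f} °C, trend {slope:+.2f} °C per hour")
```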

Analyzing complexity is difficult, and hardware is already a limiting factor. “You need powerful tools, which require a high level of investment; you cannot do the analysis on a simple laptop, you need a tailored computer,” Solà-Morales said.

Hold your horses

Having data that are consistent, reliable, and well linked is one of the biggest challenges facing pharmaceutical R&D, and integrated data is a fundamental requirement if companies are to derive maximum benefit from these technology trends. Implementing end-to-end data integration, however, requires a number of capabilities, including trusted sources of data and documents, robust quality assurance, and role-based access to ensure that specific data elements are visible only to those authorized to see them.

Although the appropriate infrastructure may be lacking at the moment, it is only a matter of time before it is put in place. Four years ago the industry was skeptical about telehealth initiatives, and it took quite some experimentation before everyone was convinced. Now everyone has a telemedicine device in their pocket.

“It’s a matter of practice and getting used to it,” Solà-Morales admitted.

Clinical trials are evolving, and while it is difficult to imagine them ever becoming obsolete, it is almost certain that they will change beyond recognition with the help of modern technologies. Sophisticated models and simulations driven by big data will make it possible to eliminate risky trials early, increasing the chances of groundbreaking medicines reaching the market. Suitable infrastructure must be put in place and standards of analysis need to be established before success can be achieved, but it is only a matter of time before big data takes over clinical trials.


Oriol Solà-Morales will be speaking at Payers' Forum Europe, 1-2 October 2014, in Berlin.