Data Quality and Technology in Clinical Trials USA

Feb 21, 2017 - Feb 22, 2017, Philadelphia

Improving quality and reducing timelines in clinical trials through the use of technology and analytics

Eliminating “Errors that Matter” in Clinical Trials

Utilizing Quality by Design principles to avoid data point overload.

Ann Meeker-O'Connell, Head, Risk Management and External Engagement, Johnson & Johnson, speaks to us about how new principles pertaining to trial design are enhancing quality and improving efficiency.

Q: Could you please give us a background on why you and your team at CTTI decided to start and lead the Quality by Design (QbD) initiative?

A: The Clinical Trials Transformation Initiative (CTTI) is a public-private partnership co-founded by the FDA and Duke University. It brings together a very diverse group of stakeholders from across the clinical trial enterprise with the goal of identifying and promoting practices that increase the quality and efficiency of trials. For the Quality by Design project in particular, we had two key challenges in mind. The first was trial complexity: there is a lot of data showing a significant increase in trial complexity, with a corollary increase in the burden on sites and on patients to carry out the trials. That complexity calls into question whether a trial is even feasible to complete, and it can lead to avoidable amendments. There is a really wonderful analogy from Robert Califf, formerly of Duke and now the FDA's Commissioner, who likens it to decorating a holiday tree: you start with something that is really simple and elegant, but everyone wants to add their own ornament, so you end up with a protocol that is really challenging to carry out and that distracts from the question it was originally intended to answer.

The other challenge we were thinking about was trial oversight: what we felt was a one-size-fits-all approach, with an indiscriminate focus on the accuracy of individual data points rather than on the sorts of errors that might impact patient safety or raise questions about the credibility of the data. We wanted to help shift oversight models from ones focused on salvage to ones focused on prevention.

To address these challenges, we established a set of QbD principles, which we think can facilitate prospective, cross-stakeholder dialogue at the time of trial design about what really matters for a particular trial. These stakeholders come not only from different functions within a company but also include patients, investigator sites, and the vendors involved. This prospective dialogue helps people focus on which aspects of the trial are critical to quality, and on how they can prevent and/or mitigate opportunities for the kinds of errors that matter to decision-making (i.e., "errors that matter").

Q: How might the QbD model reduce data point overload?

A: Last year, we issued official CTTI project recommendations, and one deals specifically with data point overload: we are collecting vast amounts of data we do not use or do not know what to do with. Our recommendation is to "focus efforts on activities that are critical to the credibility of study outcomes"; essentially, we are asking study teams to rigorously evaluate their study design and verify that the planned activities and data collection are essential. To the extent that you can eliminate activities that are not essential, you can simplify trial conduct, increase its efficiency and quality, and decrease the burden on patients who participate.

Our Principles document also encourages this kind of inquiry into data collection; it is a tool that supports this effort alongside our recommendations.

Q: Do you have a specific example of how QbD has helped to increase efficiency and streamline trial conduct?

A: CTTI ran a webinar series to provide tangible examples of how different organizations were applying QbD. One presentation that speaks to efficiency and streamlining trial conduct is that of The Medicines Company. The first step of their QbD process is very simple: "identify what matters, eliminate the rest." They apply this to data collection, to trial design, and to how the trial is overseen. The presentation specifically highlights that, on average, one-fifth of Phase II and one-third of Phase III trial procedures collect non-core data, that is, data not associated with primary or secondary endpoint evaluation, with safety assessment, or with regulatory compliance.

In response, The Medicines Company rigorously evaluated their data acquisition and its cost, making the case that non-core data collection is not only a burden on sites and participants but also comes at a cost to quality, so it makes sense to streamline trials wherever possible. I think that is one really good example of how QbD makes trials more efficient.

Q: Do you see evidence that regulators support the Quality by Design concept?

A: I think there has been strong support from regulators. On the QbD project team itself, both the FDA in the US and the European Medicines Agency have been strongly involved throughout its life. This indicates to me that they really see this as an area that can add value. Last spring, at a Medical Device Innovation Consortium meeting, I also heard Dr Califf, the new Commissioner, advocate for applying QbD concepts to device trials.

From a more formal perspective, I see elements of the QbD recommendations in the ICH E6 Integrated Addendum, particularly in the emphasis on eliminating things that are non-essential. There is a new quality management section in that document, and Section 5 of the proposed addendum states that the trial sponsor should avoid unnecessary complexity, procedures, and data collection.

We will also, hopefully, soon have a publication out that counts FDA and EMA team members among its authors. I think QbD will eventually become the norm. From the patient perspective, we cannot continue to tolerate the historical approach, in which something goes wrong and we are really good at fixing it, but at great expense and with long delays. The shift to prevention is very necessary. Even though QbD is a CTTI initiative, you also see it reflected across industry, for example in the TransCelerate materials for risk-based monitoring, which acknowledge that risk-based monitoring is essentially one quality control tool that can support an overall QbD approach to trials. People are talking about QbD and using the phrase "errors that matter"; it is now becoming part of the clinical trial lexicon. It may take a little time, but I do think it will become the way people approach trials. It just makes sense.

Ann Meeker-O'Connell will be presenting at Data Quality & Technology in Clinical Trials, April 18-19, Philadelphia.
