TMF Risk-Based Quality Control: What Does This Really Mean?

Posted by Eldin Rammell | Aug 11, 2020 8:41:30 PM

Trial Master File (TMF) quality control is a topic that always seems to be present on conference agendas, and it provokes much discussion and debate. Many clinical trial sponsors and contract research organizations (CROs) are struggling to understand what is needed in terms of quality control and quality review activities, and how these can be done more efficiently to improve TMF health and inspection-readiness.

According to ICH E6(R2), quality control is “the operational techniques and activities undertaken within the quality assurance system to verify that the requirements for quality of the trial-related activities have been fulfilled.” This means that to perform overall quality control (QC) we need to embed activities a) as part of a quality assurance system – in other words, under a pre-defined and documented regime – and b) according to specific requirements – that is, with an understanding of the quality level that is required. These QC activities must also include a verification step – Quality Review – which checks that the quality is acceptable and is perhaps the most misunderstood aspect of quality control. If Quality Review is performed in an ad hoc manner, then it is not part of a pre-defined regime. Similarly, if individuals have the flexibility to define their own quality activities, that review is unlikely to be done according to specific requirements.

So, how can we best meet these two requirements for effective quality control in a Trial Master File?

ICH E6(R2) requires that QC activities should be focused on aspects of the trial that are essential to ensure human subject protection and reliability of the trial results (§2.13 and §5.0). So, while a 100% complete and accurate Trial Master File is our goal, it is a fact that not all aspects of TMF management directly impact subject safety and data integrity. The guidance goes on to state that the “methods used to assure and control the quality of the trial should be proportionate to the risks inherent in the trial and the importance of the information collected”. The recently published EMA guideline on the content, management and archiving of the clinical trial master file specifically states that the “sponsor and/or investigator/institution should implement risk-based quality checks (QC) or review processes” (§4.2). It is therefore clear that we can use a risk-based approach to our quality review processes, which involves:

  • identification of critical processes and critical data;
  • identification and assessment of risks;
  • risk control; and
  • risk communication, review and reporting.

Let’s take a deeper look at each of these as well as some considerations for their effective implementation and management.

 

Step 1: Identification of Critical Processes and Critical Data

This step requires an assessment of the content of the Trial Master File to identify those documents or artifacts that we consider to be critical, bearing in mind their potential to adversely impact subject protection and data reliability and integrity. Unfortunately, there is no industry-standard classification available for us; each sponsor needs to make that determination for itself. Think carefully about the purpose of each artifact and how it is used by trial stakeholders. For example, trial correspondence may seem innocuous and perhaps not at all critical, but emails often capture subject safety-related decisions or decisions to approve specific actions or documents. Try not to over-think this process, though. TMF artifacts could be categorized into 2-3 criticality groups (e.g. high, medium or low) or you could assign a score (1-10) for the potential impact on patient rights, patient safety or data integrity.
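
As an illustration only, here is a minimal sketch of how such a criticality assignment might be recorded and mapped to groups. The artifact names, scores and thresholds are hypothetical, not an industry standard; each sponsor would substitute its own.

```python
# Hypothetical criticality scoring for TMF artifacts (illustrative only;
# each sponsor defines its own artifact list, scores and thresholds).

# Score 1-10 for potential impact on patient rights, patient safety and data integrity.
CRITICALITY_SCORES = {
    "Signed Informed Consent Form": 10,
    "SUSAR Safety Report": 9,
    "Monitoring Visit Report": 6,
    "Relevant Trial Correspondence": 5,   # emails may capture safety-related decisions
    "Meeting Agenda": 2,
}

def criticality_group(score: int) -> str:
    """Map a 1-10 impact score to a high/medium/low criticality group."""
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

for artifact, score in CRITICALITY_SCORES.items():
    print(f"{artifact}: score {score} -> {criticality_group(score)}")
```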

 

Step 2: Identification and Assessment of Risk

For TMF management, there are three primary risks:

  1. the document does not meet Good Clinical Practice (GCP) and/or internal requirements (poor quality document);
  2. the document is not filed correctly (poor indexing); and
  3. the document is not filed in the TMF (document missing).

In the same way that criticality was assessed in the first step, we can identify risks associated with good documentation practice. One way of achieving this is by assessing the effort that it might take to demonstrate compliance from the TMF records when there are quality issues. For example, is another original document easily identifiable? Is an alternative copy available? Can compliance with GCP be demonstrated via other records in the TMF? These factors will enable us to assign a score based on the effort required to demonstrate GCP compliance. A high impact score means it is relatively difficult to replace missing documents or show compliance by using other sources of information; these are records that would justify particular focus for QC.

We could also group artifacts according to the risk of failure in the above three areas. The most effective way of doing this is to base that classification on real-life data. This is also an area where having an eTMF system built specifically for managing Trial Master Files can be a tremendous help. By using a purpose-built eTMF solution that provides metadata on document availability against due dates or study milestones, we can generate empirical data on whether documents are in the TMF at the required time. If we see that a particular artifact is frequently filed late into the TMF or not filed at all, we can put that artifact type into the high-risk group. Conversely, if our eTMF data shows that an artifact is always filed on time, we can put that artifact in the low-risk group.

Similarly, our eTMF Quality Review workflow should capture QC rejections, including the reason for rejection. By using an eTMF with embedded quality review workflows and management, we can analyze the workflow metadata to identify which artifacts are most frequently indexed incorrectly and why other documents are rejected – for example, missing signatures, missing page numbers, etc. Based on this data, we can assign artifacts to risk groupings; those with frequent errors in content or indexing will appear in the high-risk group.
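
To make the idea concrete, the following sketch shows how risk groups could be derived from such metadata. It assumes the eTMF can export, per artifact type, counts of expected documents, on-time filings and QC rejections; the field names and thresholds are hypothetical and will differ from system to system.

```python
# Illustrative risk grouping from eTMF metadata (field names and thresholds
# are hypothetical; real exports will differ by eTMF system).
from dataclasses import dataclass

@dataclass
class ArtifactStats:
    expected: int        # documents expected by due date or study milestone
    filed_on_time: int   # documents filed by the due date
    qc_reviewed: int     # documents that went through Quality Review
    qc_rejected: int     # documents rejected at Quality Review (any reason)

def risk_group(stats: ArtifactStats) -> str:
    """Assign a high/medium/low likelihood-of-issue group from historical rates."""
    late_or_missing = 1 - stats.filed_on_time / stats.expected if stats.expected else 0.0
    rejected = stats.qc_rejected / stats.qc_reviewed if stats.qc_reviewed else 0.0
    worst = max(late_or_missing, rejected)
    if worst > 0.20:
        return "high"
    if worst > 0.05:
        return "medium"
    return "low"

# e.g. an artifact type frequently filed late and often rejected at QC
print(risk_group(ArtifactStats(expected=120, filed_on_time=85, qc_reviewed=85, qc_rejected=12)))  # -> high
```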

We can also take one other factor into consideration in determining risk, namely the lifecycle of the document. Documents that undergo a workflow activity in the eTMF that includes content review and in-line approval are likely to have a lower incidence of quality issues than documents that are simply brought into the eTMF as finalized records at the end of their active life. Similarly, if the sponsor or CRO is really using the eTMF as a study management tool, with study team members actively referencing documents for their day-to-day activities, this in itself will highlight quality issues. In these circumstances, the higher-risk documents are those that have not been looked at since they were made available in the eTMF. Once again, the system metadata will help us to identify which artifact types fall into this category.

Defining a QC strategy therefore involves taking account of these four factors (a simple way of combining them is sketched after the list):

  1. impact of the artifact on patient safety, patient rights and data integrity;
  2. effort to demonstrate compliance where document quality issues exist;
  3. likelihood of quality issues arising (based on empirical data); and
  4. robustness of document review processes.
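
One purely illustrative way of combining these four factors is a weighted score per artifact type; the weights and scales below are hypothetical and would need to be agreed within your own quality system, not taken as a recommendation.

```python
# Hypothetical weighted combination of the four QC-strategy factors into a
# single priority score (weights and scales are illustrative only).

def qc_priority(impact: int, effort_to_demonstrate: int,
                likelihood_of_issue: int, review_robustness: int) -> float:
    """
    impact, effort_to_demonstrate, likelihood_of_issue: 1 (low) to 10 (high).
    review_robustness: 1 (weak upstream review) to 10 (robust in-system review/approval).
    A higher result means a stronger case for 100% Quality Review.
    """
    return (0.4 * impact
            + 0.2 * effort_to_demonstrate
            + 0.3 * likelihood_of_issue
            + 0.1 * (10 - review_robustness))   # robust upstream review lowers priority

# e.g. a high-impact artifact that is hard to reconstruct and has a history of issues
print(round(qc_priority(impact=10, effort_to_demonstrate=9,
                        likelihood_of_issue=7, review_robustness=3), 2))  # -> 8.6
```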

It is clear from this that a critical QC success factor is having an eTMF that can provide this level of detail in its metadata and enable easy reporting of quality data. In addition to sponsor-specific data from your own eTMF, a select few eTMF providers can also assess an anonymized data warehouse of eTMF metadata to identify quality trends across numerous sponsors. This is especially helpful for organizations that may not have their own historical quality data to review.

 

Step 3: Risk Controls (Quality Review)

Implementing effective risk controls means putting into place a tailored quality review process. So, rather than having a “one size fits all” QC routine, where all documents go through exactly the same process, we can now configure different review activities based on the risk categorization that we’ve performed. Some artifacts will not need a Quality Review if they are not identified as critical and they have a low-risk profile, e.g. they have been received from Good Manufacturing Practice (GMP) colleagues who have their own internal quality processes. Some high-risk artifacts might need a 100% Quality Review, whilst others may justify Quality Review of a sample only.
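
As a closing sketch, here is one hypothetical way the criticality and risk groups from the earlier steps could drive that routing; the rules and the 10% sample rate are illustrative only, not a recommendation.

```python
# Hypothetical routing of artifacts to a Quality Review level based on
# criticality and risk group (rules and sample rate are illustrative only).
import random

def review_level(criticality: str, risk: str) -> str:
    """Return 'full', 'sample' or 'none' for an artifact classification."""
    if criticality == "high" or risk == "high":
        return "full"        # 100% Quality Review
    if criticality == "medium" or risk == "medium":
        return "sample"      # Quality Review of a defined sample only
    return "none"            # rely on upstream controls (e.g. GMP quality processes)

def select_for_review(documents: list[str], criticality: str, risk: str,
                      sample_rate: float = 0.10) -> list[str]:
    """Pick the documents that will actually go through Quality Review."""
    level = review_level(criticality, risk)
    if level == "full":
        return list(documents)
    if level == "sample":
        k = max(1, round(len(documents) * sample_rate))
        return random.sample(documents, k)
    return []

docs = [f"doc-{i:03d}" for i in range(50)]
print(select_for_review(docs, criticality="medium", risk="low"))  # a 5-document sample
```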

One caution: Many organizations attempt to manage their Quality Review activities using spreadsheets. However, this can be extremely resource-intensive, prone to error and also very confusing for the people having to perform these review tasks. At a TMF Summit in Orlando (January 2019) we learned of one sponsor who performed root-cause analysis for their critical inspection findings and discovered that a major contributing factor was the use of Excel for tracking their periodic reviews. Your eTMF system should be capable of identifying finalized artifacts due for QC action and tracking those issues through to resolution – without the need for spreadsheets!

 

Step 4: Risk Communication, Review and Reporting

As with other areas of risk-based QC, eTMF technology can help organizations significantly improve the efficiency and accuracy of communications around the Trial Master File. Taking full advantage of this capability entails using an eTMF built on TMF best practices, rather than a general document management system or a broad (but not deep) eClinical suite. With a dedicated eTMF, you can manage the various QC activities such that the TMF status, outcomes and trends can be easily surfaced, reported and escalated as appropriate – something that most systems, and certainly spreadsheets, do not do well.

Further, a TMF quality regime should be cyclical, such that the incidence and type of quality control failures go into a feedback loop to drive future Quality Review activities. Artifacts that started out as high-risk may become low-risk over time as a result of timely escalation of QC issues, end-user familiarity with the system and ongoing user education. Again, this is where having a purpose-built eTMF can lead to ongoing productivity and quality improvements, by easily incorporating the latest knowledge and organizational best practices into standard TMF quality review workflows.

By following a true risk-based approach and leveraging new technology capabilities for your Trial Master File, it’s now possible to reduce time and effort while increasing accuracy – optimizing your use of resources to focus on those areas where Quality Review will add real value. The result? Improved and lasting TMF health and inspection-readiness.
