Tips to Ensure Data Quality When Implementing or Optimizing a CTMS

Ryan Monte
Senior Product Marketing Manager, Forte
June 13th, 2019

Note: This article summarizes a session led by Christopher Thomas of Forte, presented at Forte’s Onsemble Conference. To learn more about the conference, visit conference.onsemble.net.

When implementing a new clinical trial management system (CTMS) or optimizing a current one, clinical researchers often have data quality top of mind. Creating processes that result in accurate, timely and complete data is invaluable, leading to more efficient workflows and less anxiety around reporting. But quality data is not an accident; it's the result of rigorous planning and execution. Below are three key areas of focus when establishing a data quality initiative.

Expectations and Planning

One common mistake research institutions make is attempting to address all of their data quality issues at once. When implementing a new CTMS, there’s often pressure to justify both the cost and resources needed for the project, leading to unrealistic expectations. Instead, your organization should prioritize goals, targeting your biggest needs first. A cost/benefit analysis may be needed to determine if certain data points are worth tracking at all, especially if they affect staff bandwidth.

Establishing this “minimum footprint,” or minimum amount of functionality and processes needed to go live, is key to creating a successful program. When prioritizing specific data points, here are a few questions to ask:

  • Does the data support larger organizational goals?
  • Is the data relevant to multiple areas (e.g. Cancer Center and AMC; IITs and Industry Studies)?
  • Is the data collected for every protocol (e.g. protocol number and title, PI, IRB number, etc.)?
  • Does the data require other data to be present in order to be useful?
  • Is the data actionable? If not, should it be a priority?

Prioritizing goals based on the questions above will make it easier to collect data accurately and consistently, more effectively communicate the purpose of these data points to your staff and ultimately reduce time to value with your system.
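To make the "collected for every protocol" criterion concrete, here is a minimal sketch in Python of a completeness check. The field names (protocol_no, title, pi, irb_number) are hypothetical examples drawn from the list above; substitute whichever data points your organization designates as required.

    # Minimal sketch: flag protocol records that are missing required fields.
    # Field names are hypothetical examples; substitute your own priorities.

    REQUIRED_FIELDS = ["protocol_no", "title", "pi", "irb_number"]

    def find_incomplete_protocols(protocols):
        """Return (protocol identifier, missing fields) for each record with gaps."""
        incomplete = []
        for record in protocols:
            missing = [field for field in REQUIRED_FIELDS if not record.get(field)]
            if missing:
                incomplete.append((record.get("protocol_no", "<unknown>"), missing))
        return incomplete

    if __name__ == "__main__":
        sample = [
            {"protocol_no": "2019-001", "title": "Phase I Study", "pi": "Dr. Smith", "irb_number": "IRB-100"},
            {"protocol_no": "2019-002", "title": "", "pi": "Dr. Jones", "irb_number": None},
        ]
        for protocol_no, missing in find_incomplete_protocols(sample):
            print(f"{protocol_no}: missing {', '.join(missing)}")

A check like this, run regularly against your highest-priority data points, gives staff a clear, shared definition of "complete" long before go-live.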

During our free, upcoming webinar, hear expert presenters address common concerns, provide best practices for building a successful data integrity framework and share their journey to establishing organizational data integrity. Register today! 

Evaluating Your Collection and Monitoring Processes

Once you’ve identified your most important data points, it’s time to review your methods for data collection and monitoring. Bad data is often more harmful than no data, so it’s essential to establish these processes well before your go-live date. Monitoring needs to be clearly defined, both from a personnel perspective (i.e. who is ultimately responsible for data integrity?), and a process perspective (i.e. what tools and reports can this person use to ensure processes are being followed?).

This is also an ideal time to review how other teams throughout your organization collect and monitor data. While standardizing collection and monitoring workflows across the organization isn't a necessity, it can boost data consistency across teams, especially if those teams use the same CTMS.

Tools to Ensure Ongoing Success

You’ve established your minimum footprint and defined your data collection processes. The next step is to provide your staff with the tools to maintain these workflows. As your organization evolves and as staff inevitably turn over, well-defined SOPs and documented goals will ensure data quality remains top of mind. Many organizations review these processes during onboarding, but it’s also important to make instructions available where existing staff conduct their day-to-day work. This could be within the CTMS itself, on your organization’s intranet, in face-to-face meetings or any other place frequented by team members.

Some systems can also be set up to run discrepancy reports to validate your data. Tools such as Forte Insights can help identify missing data for both protocols and subjects, and prioritize those errors to let you fix the most important elements first. Forte’s OnCore Enterprise Research System provides a great solution for large research organizations looking to improve the quality of their data. OnCore is a standardized system used by leading researchers across the country, with integrated standard reports and workflows that can help improve your processes. These are built in close collaboration with members of Forte’s customer community. Click here to learn more and to request an OnCore demo.
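For teams that don’t yet have such tooling, the general idea behind a discrepancy report can be sketched as below. This is only an illustration, not how Forte Insights or OnCore works internally; the field names and priority weights are assumptions you would replace with your own.

    # Hedged sketch of a simple discrepancy report: count missing values per field
    # and sort by an assumed priority weight, so the most important gaps surface first.
    # Field names and weights are hypothetical; real reports would come from your CTMS.

    from collections import Counter

    FIELD_PRIORITY = {"irb_number": 3, "pi": 2, "accrual_goal": 1}  # higher = more important

    def discrepancy_report(records):
        missing_counts = Counter()
        for record in records:
            for field in FIELD_PRIORITY:
                if not record.get(field):
                    missing_counts[field] += 1
        # Sort by priority weight first, then by how often the field is missing.
        return sorted(missing_counts.items(),
                      key=lambda item: (FIELD_PRIORITY[item[0]], item[1]),
                      reverse=True)

    if __name__ == "__main__":
        protocols = [
            {"irb_number": "IRB-100", "pi": "Dr. Smith", "accrual_goal": 40},
            {"irb_number": None, "pi": "Dr. Jones", "accrual_goal": None},
            {"irb_number": "", "pi": None, "accrual_goal": 25},
        ]
        for field, count in discrepancy_report(protocols):
            print(f"{field}: missing in {count} record(s)")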

If you’d like to learn more about data quality and maintaining data health at your organization, register for our upcoming webinar Building a Foundation for Quality Clinical Research Data, Reporting and Analytics, Tuesday, June 18th at 12pm Central Time. 
