Data Integrity – What Are the Biggest Challenges?

Original Article posted on Technology Networks, February 1, 2019 by: Ruairi J MacKenzie


Ensuring that data is reliable, reproducible, and therefore valuable means companies in both regulated and non-regulated spaces must keep data integrity at the heart of their informatics operations. Laboratory informatics consultants CSols, Inc. have been teaching their clients data integrity best practices since 2001. We talked to John Zenk, Validation Services Manager at CSols, to find out what the biggest concerns in data integrity are and how companies can take steps to overcome these challenges.

Ruairi MacKenzie (RM): Can you tell me a bit about CSols Inc. and your role there?

John Zenk (JZ): Sure. CSols consultants are experts in laboratory informatics, within both the regulated and non-regulated spaces, helping organizations in all industries, from planning and selecting the right informatics solution to implementing and enhancing it in the laboratory. Within my group, we validate informatics systems across all regulated environments. My role in particular is Director of Software Validation Services at CSols. The group I manage handles all standard software validation activities for Laboratory Information Management Systems (LIMS), chromatography data systems (CDS), and general benchtop, standalone laboratory software with attached instruments. Our approach is based on industry best practices (e.g., GAMP, PIC/S), regulatory requirements, data integrity requirements, and our experience as leaders in the informatics industry.

RM: What are the major data integrity issues that are facing companies in this space from your perspective?

JZ: The FDA's Data Integrity and Compliance With Drug CGMP: Questions and Answers, Guidance for Industry, released in December 2018, addresses the current data integrity concerns. Lately, we're seeing most data integrity issues revolve around audit trails. It is commonplace now for laboratory software systems to have audit trail functionality, especially the leading top-tier systems. But what we are helping many of our clients determine is what to actually do with those audit trails. Too often, labs fall into a routine of setting it and forgetting it: they've set up their audit trail and they know it's there, but they don't revisit it. Do they actually know what's being captured and what isn't? Who's reviewing the audit trail, and how often is it getting reviewed? Is it part of the data being submitted to a quality group for review?
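
To make that review concrete, here is a minimal sketch in Python of the kind of periodic audit trail check a quality group might script. The entry fields and the export format are hypothetical illustrations, not the API of any particular LIMS or CDS.

    # A minimal sketch of a periodic audit trail review. All field names
    # are hypothetical; real systems export audit trails in their own formats.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class AuditEntry:
        timestamp: datetime
        user: str        # who performed the action
        action: str      # e.g., "create", "modify", "delete"
        record_id: str   # the data record affected
        reason: str      # reason for change, if the system captures one

    def entries_needing_review(entries, last_review):
        """Return entries made since the last documented review,
        listing higher-risk actions (modify/delete) first."""
        recent = [e for e in entries if e.timestamp > last_review]
        return sorted(recent, key=lambda e: e.action not in ("modify", "delete"))

    entries = [
        AuditEntry(datetime(2019, 1, 22, 14, 30), "asmith", "create",
                   "SAMPLE-0043", ""),
        AuditEntry(datetime(2019, 1, 20, 9, 5), "jdoe", "modify",
                   "SAMPLE-0042", "transcription error"),
    ]
    for e in entries_needing_review(entries, datetime(2019, 1, 1)):
        print(e.timestamp, e.user, e.action, e.record_id, e.reason)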

A second area we commonly deal with is security and access. We still see some labs using a single shared laboratory account to access their systems. This usually happens when a lab is small but constantly runs a lot of data, so, for ease of use, everyone logs in with the same account. However, a shared account makes it impossible to trace who performed which activities in the system. Therefore, we help them understand why it's not best practice for data integrity and help them figure out how to better control security and access.
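
A toy illustration of that traceability problem, using hypothetical names: the audit record is only as attributable as the identity behind the login.

    # A minimal sketch: with individual accounts each entry names a person;
    # with a shared account, every entry points at the same anonymous login.
    from datetime import datetime, timezone

    def log_action(audit_log, user_id, action, record_id):
        """Append an entry attributed to the authenticated user."""
        audit_log.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user_id,   # "jdoe" is traceable; a shared "LAB" login is not
            "action": action,
            "record": record_id,
        })

    audit_log = []
    log_action(audit_log, "jdoe", "approve_result", "SAMPLE-0042")  # traceable
    log_action(audit_log, "LAB", "approve_result", "SAMPLE-0043")   # who did this?
    for entry in audit_log:
        print(entry)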

The third big issue we find is labs incorrectly implementing a risk-based approach for validating the electronic workflows used within the lab. The new guidance places more stress on verifying that all the workflows are correctly configured and actually work, rather than saying, "Well, we've put our most complex workflow through the system and shown that it works, so the less complex workflows should work as well." Labs must ensure their computer system has been validated for its full intended use; otherwise it's not possible to know whether all workflows will run correctly.
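
As a sketch of what "validated for its full intended use" implies in practice, the loop below runs a documented test case through every configured workflow rather than only the most complex one. The harness and the workflow names are hypothetical.

    # A minimal sketch: collect pass/fail evidence for each configured
    # workflow, not just the most complex one.
    def validate_workflows(workflows, run_workflow):
        """Run a documented test case through every workflow and record the outcome."""
        results = {}
        for name, case in workflows.items():
            actual = run_workflow(name, case["input"])
            results[name] = "PASS" if actual == case["expected"] else "FAIL"
        return results

    # Hypothetical configured workflows and a stand-in execution function.
    workflows = {
        "stability_assay":   {"input": 10.0, "expected": 10.0},
        "dissolution_assay": {"input": 5.0,  "expected": 5.0},
    }
    print(validate_workflows(workflows, lambda name, value: value))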

RM: But what can companies and organizations do to tackle these challenges, both within their informatics systems and outside of digitization?

JZ: I think understanding the policies, the procedures, and what guidance documents exist is a good first step. A common scenario we've seen is a client who knows what types of SOPs or work instructions they need because they use the system; but over time, as they've changed the system or moved from one system to another, they may not have spent enough time revisiting those SOPs and understanding what their procedures say about things like audit trails, what their data are, and the processes they have in place for different types of data.

Understanding metadata is also critical. As systems become much more configurable on the user end, end users and labs gain new flexibility to tailor a system to their intended use. But you need to take that extra step and be aware that more control over a system means you have to know more about the metadata – more about what's being configured and which data are linked to other data.
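
A small illustration, with a hypothetical record structure, of what "knowing which data are linked to other data" means: a reported value only stays reconstructible while the records it depends on remain attached to it.

    # A minimal sketch: the result value is hard to defend at review time
    # unless its metadata links (method version, instrument, calibration) survive.
    result = {
        "value": 98.7,
        "units": "% label claim",
        "metadata": {
            "method_id": "HPLC-ASSAY-07",
            "method_version": 3,            # which configuration produced this?
            "instrument_id": "HPLC-02",
            "calibration_id": "CAL-2019-0115",
            "analyst": "jdoe",
        },
    }

    # A configuration change that is not reflected here quietly breaks the
    # link between the number and how it was produced.
    missing = [k for k, v in result["metadata"].items() if not v]
    assert not missing, f"orphaned result: missing metadata {missing}"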

RM: Are there any other issues around data integrity that companies maybe don’t consider?

JZ: I think the next step is not only ensuring that system-specific data integrity processes exist, but also looking more generally at how software validation and data integrity are addressed at a corporate level from the start. It's one thing to take a tactical approach as issues arise, but it's also wise to take the time to develop a strategic approach and assess a master validation plan. Companies should ask, "Does our QA group understand what's involved in software validation? Do our end users and lab management understand what it takes to ensure a system is fully validated?" Taking a one-off or system-specific approach is doable, and it definitely helps. But with that approach, companies might miss the real issues on a much larger scale.