The process of software validation ensures that the system fits its intended use and functions as it should. Computer system validation (CSV) for laboratory informatics is essential because regulated businesses must ensure the safety of their products for consumers, and their laboratory informatics systems (LIMS, ELN, CDS) are an integral part of that. Despite its importance, validation is often seen as confusing and challenging to execute correctly. It is possible to do it right, though, and CSols has proof: in almost 30 years of performing CSV, none of our clients has received an FDA Form 483.
Validation is part of the software development life cycle. In this blog, we’ll review what that means and how to do it so that your system will be defendable in a regulatory audit.
No discussion of computer systems validation is complete without an overview of the legislation around it. In the United States, the Food and Drug Administration (FDA) regulates specific industries that directly impact consumer health, including pharmaceuticals, cosmetics, and food and beverage. These industries have an added responsibility to ensure their products are safe and their data are secure. The relevant legislation addressing aspects of computer systems validation in the United States comes from the Code of Federal Regulations (CFR), most specifically 21 CFR Part 11 (Part 11), dealing with electronic records and signatures. Similar government agencies and regulations apply in other countries as well.
Part 11 mandates the requirements for electronic records and signatures to be considered accurate, reliable, readily retrievable, and secure, to replace paper records and handwritten signatures legally. Validating your computer system is the primary means of determining that electronic records and signatures can be used in this way.
Validation can take many shapes during the computer system life cycle, depending on whether it is a new implementation or an upgrade to an existing system. For new systems that the user hopes can solve a current problem, validation happens from the ground up. For an existing system that needs an upgrade or is expanding the scope of its intended use, the need is to keep the system in a validated state by testing the new capabilities before releasing them into production use. The validation process ends when a system is retired and its data are successfully migrated to a new system or archived. The figure below shows how validation supports the project life cycle.
Your validation master plan guides you through the validation process and serves as a checklist to ensure that everything happens as it should. Once you've assessed the as-is state of your system, the validation master plan encompasses all the other steps you'll take to ensure your system is validated in its current state and fit for its intended use.
The validation master plan should account for requirements gathering, a functional risk assessment, a trace matrix, IQ/OQ/PQ protocols and testing, and change control procedures with periodic reviews. Each part of the validation master plan is executed in a defined order. Your requirements should be complete and the risk assessment finished before you move on to developing the trace matrix and then performing the testing. This way, you minimize the risk of having to go back and develop new test cases late in the process.
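To make the trace matrix concrete, here is a minimal sketch of one in Python. The requirement IDs and test case names are hypothetical, and a real trace matrix would live in a validation tool or controlled spreadsheet; the point is simply that every requirement should map to at least one test case, and gaps should surface before testing begins.

```python
# Hypothetical trace matrix: user requirement IDs mapped to the
# qualification test cases that verify them (all names are made up).
trace_matrix = {
    "URS-001": ["OQ-TC-01", "PQ-TC-03"],  # e.g., audit trail requirement
    "URS-002": ["OQ-TC-02"],              # e.g., electronic signature requirement
    "URS-003": [],                        # no test coverage yet -- a gap
}

def uncovered_requirements(matrix):
    """Return requirement IDs that have no linked test cases."""
    return [req for req, tests in matrix.items() if not tests]

# Flag coverage gaps before test execution starts.
print(uncovered_requirements(trace_matrix))  # ['URS-003']
```

Running a check like this early is what keeps you from discovering an untested requirement late in the validation effort.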
It is critical to ensure that your requirements and specifications are well defined and approved before validating the computer system. The validation V-Model is commonly used to visualize the relationship between requirements and specifications and the testing performed on them (see diagram below). Qualification testing (down the right side of the V) is designed based on your intended use and the functionality required to meet that use (represented down the left side of the V).
IQ/OQ/PQ testing is arguably the most critical part of the validation process. Successful completion of this testing verifies that your system functions as intended and is fit for its intended use in your environment. It is best practice to approve your user requirements and functional specifications before testing to avoid scope creep and possible re-testing.
To learn more about IQ/OQ/PQ testing, watch our webinar.
The people writing and executing your IQ/OQ/PQ testing should be thoroughly familiar with your lab informatics system (LIMS, ELN, CDS) and your intended use. If your in-house staff does not have the bandwidth or experience for proper testing, you should work with qualified CSV consultants, like CSols, who have the requisite experience with your informatics systems.
The importance of laboratory informatics data cannot be overstated. When you have data in a validated environment, you need to ensure that your data remain secure and reliable. The acronym ALCOA identifies the five basic principles of data integrity: data must be Attributable, Legible, Contemporaneous, Original, and Accurate. More recently, four additional principles have been added, so that the acronym is now ALCOA+. The four additions are Complete, Consistent, Enduring, and Available.
Data integrity is integral to all validation activities. Following the principles of ALCOA+ ensures that your system captures, produces, reports, transfers, and stores data that are secure, retrievable at will, and reliable.
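As an illustration of what "attributable" and "contemporaneous" mean in practice, the sketch below models a lab result record carrying the metadata those principles call for, plus a simple completeness check. The field names and instrument ID are hypothetical, not a prescribed schema.

```python
from datetime import datetime, timezone

# Metadata a record needs to satisfy the ALCOA principles sketched here:
# who recorded it (Attributable), when (Contemporaneous), and what (Accurate).
REQUIRED_FIELDS = {"analyst", "recorded_at", "instrument", "value", "units"}

def is_alcoa_complete(record):
    """True if the record carries every required metadata field;
    a missing field means the record is not fully attributable."""
    return REQUIRED_FIELDS <= record.keys()

result = {
    "analyst": "jdoe",                                      # Attributable
    "recorded_at": datetime.now(timezone.utc).isoformat(),  # Contemporaneous
    "instrument": "HPLC-02",                                # made-up instrument ID
    "value": 98.7,
    "units": "%",
}
print(is_alcoa_complete(result))           # True
print(is_alcoa_complete({"value": 98.7}))  # False -- orphaned value, no context
```

In a validated system these checks are enforced by the application itself (required fields, audit trails, server-side timestamps), not by ad hoc scripts; the sketch only shows the shape of the rule.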
Although we are still waiting for the FDA to release their expected guidance about computer system assurance (CSA), it is coming. At its core, CSA reinforces a risk-based approach that expands on the GAMP 5 principles of product and process understanding, quality risk management, and leveraging supplier activities. Risk is assessed based on the big picture of the overall business process. Doing so places more emphasis on test efficiency, focusing on testing that ensures the system is fit for purpose.
Validation of computer systems can involve challenges, including the risk of system failure, restrictive company policies, and increasingly stringent regulatory requirements. Another significant issue arises when users must balance risk against cost once risk categories are defined. A risk-based approach to CSV can help to mitigate some of these challenges.
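One common way to operationalize a risk-based approach, in the spirit of GAMP 5, is to score each system function on impact, likelihood, and detectability, then direct the most rigorous testing at the highest scores. The functions and scores below are illustrative assumptions, not a prescribed scale.

```python
# Hypothetical risk scoring for LIMS functions: risk = impact x likelihood
# x detectability (higher detectability score = harder to detect a failure).
functions = [
    {"name": "result approval", "impact": 3, "likelihood": 2, "detectability": 3},
    {"name": "report printing", "impact": 1, "likelihood": 2, "detectability": 1},
    {"name": "audit trail",     "impact": 3, "likelihood": 1, "detectability": 3},
]

for f in functions:
    f["risk"] = f["impact"] * f["likelihood"] * f["detectability"]

# Highest-risk functions first: these get the most thorough test cases.
for f in sorted(functions, key=lambda f: f["risk"], reverse=True):
    print(f"{f['name']}: {f['risk']}")
```

Ranking this way lets you justify lighter testing of low-risk functions to an auditor, because the rationale is documented rather than arbitrary.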
Additional steps you can take to avoid validation problems include the following:
Properly executing a computer system validation is an involved process, but you can do it when you have the right expertise. If you aren’t confident that your in-house staff has the necessary CSV experience, reach out to us.
Is there anything else you’d like to know about computer system validation that hasn’t been addressed here? Comment below, and we’ll get you an answer.