CSV to CSA: A Paradigm Shift in Computerized Systems’ Approaches – or not?

Charlie Wakeham has more than 20 years of industry experience developing and validating computerized systems for regulated production and laboratory environments. An active ISPE GAMP® member since 1999, she serves on the GAMP Data Integrity Leadership Team and GAMP Global Council and has contributed content to multiple GAMP Good Practice Guides. Charlie is Waters Corporation’s Regional CSV Consultant, working directly with Waters’ customers in the Asia Pacific region to provide efficient, practical assistance with the validation of their laboratory informatics computerized systems.

Opinions expressed by CSols Inc. contributors are their own.

In this blog, Charlie examines the Computer System Assurance initiative from the U.S. Food and Drug Administration (FDA) against the long-established CSV approach. For more posts from Waters experts, visit blog.waters.com.

No more validation.

CSV is a thing of the past.

These are just some of the attention-grabbing headlines I’ve seen in various blogs and LinkedIn posts over the last 12 months about Computer System Assurance (CSA), and the reality couldn’t be more different.

The U.S. FDA originally targeted their Case for Quality initiative (2011) at improving medical device quality and usability for patient safety. More recently, FDA’s Center for Devices and Radiological Health has sponsored an industry pilot team under the CSA title to improve Computer System Validation (CSV) approaches, as the existing practices were identified as a barrier to the adoption of new technologies. (Grateful thanks are given to the members of the CSA industry team, whose materials this blog has leveraged.)

But, really, how much difference can there be between Validation and Assurance?

Evolving from Computer Systems Validation into Computer Systems Assurance

Underneath all the hype and excitement, in practical terms, CSA is fundamentally the risk-based approach promoted by the FDA in 2003 and applied to CSV by GAMP 5 in 2008. We’ll look at the pros and cons of the current CSV methodology and understand where industry has failed in applying critical thinking and risk-based approaches.

By understanding the CSA principles, we can then move on to embrace the benefits of CSA, such as:

  • Efficiency gains: exchanging a large proportion of the time previously spent writing test documents for time now spent actively testing and challenging the system
  • Better detection of software faults in the system and non-conformances to the URS before handover to operational use
  • Improved protection of patient safety, product quality, and data integrity

Additional Reading: CSV is Not a Commodity

We Already Have Computer Systems Validation…

Properly applied, CSV uses a documented risk assessment to focus effort on the highest-risk system functionality, and uses scripted testing to demonstrate that those risks are mitigated by the system configuration and applied controls. It provides documented evidence that the system is fit for its intended use within a regulated environment and will protect patient safety, product quality, and data integrity.

The Problem With CSV

That’s what properly applied CSV achieves. The issues I’ve encountered too many times over my career in CSV include:

  • A lack of understanding within the regulated company, as in “we want to have a compliant validated system but we’re not sure how we’ll use it”; it is impossible to validate for intended use until the intended use is defined and understood
  • Regulated company project teams deliberately manipulating the risk assessment towards higher ratings, to force more testing during a CSV contract
  • A test-everything approach, “just in case”
  • Unnecessary demands for screen capture or other hard-copy evidence on every test step, “so we can be sure the tests were done properly”
  • Excessive levels of document review and approval signatures, typically by individuals with no knowledge of either the CSV process or the system it is being applied to

These issues are not failings of the CSV risk-based approach so expertly laid out in ISPE GAMP 5; rather they are failures of industry to apply this approach effectively.

Understanding CSA Principles

CSA requires the use of critical thinking, whereby facts are analyzed impartially to identify patterns, to assess trends and theories, and to evaluate outcomes.


This critical thinking should be combined with the risk-based approach to ensure the focus of effort is on the system, software, or function that can directly impact patient safety, product quality, and data integrity. Understanding of the potential impact is aided by defining the business process and data flows, and assessing the risk based on the overall business process, not just on an individual step in a system; e.g., are there further controls downstream that would highlight an error that occurred?

The key to CSA is that credit is given for all testing, including:  

  • Supplier testing during the system development lifecycle should be leveraged for assurance that the software functions correctly against vendor specifications. Supplier assessment and/or review of their internal test practices are essential precursors to reliance on this.
  • Unscripted testing (testing without written test step instructions) typically relies on the tester’s experience to create test cases dynamically and is effective at finding software faults under intended use conditions. There should be a summary description of the items tested, who performed the testing and when, and details of any failures found.
  • Limited scripted testing should be used for medium to high risks, with expected results and an indication of pass/fail as well as a summary description of the items tested, who performed the testing and when, and details of any failures found.
  • Robust scripted testing is reserved for verification of high-risk functionality, as determined by critical thinking and risk assessment, and requires the formal pre-approved test protocols so widely used in CSV.
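The tiered approach above amounts to a simple decision rule: match the rigor of the assurance activity to the assessed risk, and take credit for supplier testing where it is available. As a minimal illustrative sketch only (the risk levels and activity names here are my own shorthand, not terms defined by GAMP 5 or the FDA):

```python
# Hypothetical sketch of CSA's tiered selection of assurance activities.
# Risk levels and activity descriptions are illustrative shorthand,
# not terminology defined by GAMP 5 or the FDA CSA initiative.

def select_assurance_activity(risk: str, supplier_tested: bool) -> str:
    """Map an assessed risk level to a testing rigor, giving credit
    for supplier testing on lower-risk, standard functionality."""
    if risk == "high":
        return "robust scripted testing (formal pre-approved protocols)"
    if risk == "medium":
        return "limited scripted testing (expected results, pass/fail record)"
    if supplier_tested:
        return "leverage supplier testing (after supplier assessment)"
    return "unscripted testing (summary of items tested, tester, date, failures)"

# Example: a low-risk, supplier-tested function needs no repeated testing.
print(select_assurance_activity("low", supplier_tested=True))
```

The point of the sketch is simply that the heavy, formally pre-approved protocols are reserved for the high-risk tier; everything below it trades documentation effort for actual testing effort.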

Test activities must add value to the assurance that the system is fit for purpose. Test efficiency can be further enhanced by:

  • The use of automated test tools,
  • Digital capture of test data and test evidence, used only when necessary, and
  • Electronic management and control of test documentation, so that it can be efficiently generated, approved, retained, reviewed, and re-used while remaining effective at identifying defects and demonstrating fitness for intended use.

Implementing the New Approach for a Chromatography Data System

The FDA Case for Quality program started with Medical Devices, but the CSA approach is now being applied successfully to computerized systems used in other regulated areas, e.g., pharmaceutical organizations.

In the context of a Chromatography Data System – used in a pharma QC lab with a high direct impact on patient safety and product quality – CSA would allow the focus to be on:

  • Security and access controls
  • Methods and calculations
  • Workflow and electronic approval routings
  • Data transfer to and from other systems

A combination of ad-hoc unscripted testing for error guessing and intended use verification, and robust scripted testing to demonstrate reliability for the highest-risk items, would be used for these functions.

Potentially, the application views and menu navigation, the ability to create custom summary reports (which are neither original records nor complete data and therefore should not be used for decision-making), the management of pre-defined reasons, and other typically lower-risk functionality can leverage the supplier’s development testing. If standard functionality has been adequately tested by the supplier, there is no need to repeat any testing for those aspects. The need for any additional scripted testing can be evaluated by considering factors such as simplicity/complexity, novelty, detectability, and degree of configuration or customization, as well as any process-specific risk factors.

Conclusion

Computer System Assurance is not a replacement for, nor a contradiction of, current Computer System Validation approaches as defined in GAMP 5 but rather a reinforcement, or restatement, of the GAMP 5 key principles of product and process understanding, quality risk management, and leveraging supplier activities. CSA combines risk-based testing with risk-based documentation. It is the risk-based approach “on steroids”, and my hope is that CSA provides the momentum for regulated companies to finally, effectively, embrace that risk-based approach for the ongoing safety of their patients.


Do you believe that less documentation will result in a lower quality of validation?

Would you feel confident to face a regulatory inspection with formal test documentation covering mainly the high-risk functionality, with limited scripted testing or even no testing of other functionality?
