What does Silly Putty have in common with Computer System Validation (CSV)? A thought-provoking question, but one that points out how much we don’t know about the origins of many things we use daily.
Silly Putty and CSV are both examples of innovation driven by military needs that are now part of our daily lives. Silly Putty was invented during World War II, when the Japanese army cut off the United States from its supply of natural rubber, a crucial material at the time. A chemist at General Electric invented a potential substitute from a mixture of boric acid and silicone; it turned out to have no military value but eventually gained a place in pop culture as Silly Putty. Similarly, CSV (and tomorrow’s Computer Software Assurance) originates from a process developed by the military: independent verification and validation (IV&V) of software. This process, developed in the 1960s, was first implemented on a large scale by SAIC for the Safeguard Anti-Ballistic Missile System in 1971. Today, software verification and validation is codified in Institute of Electrical and Electronics Engineers (IEEE) Standard 1012.
Let’s take a short trip through history since the 1970s to look at why computer software validation became so important, and how it might be shaping the future.
From anti-ballistic missile systems to smartwatches taking ECG readings on our wrists, computer software has become indispensable to our daily lives, and the importance of verifying and validating (or assuring) it has grown along with it. In the early days of software validation, the systems that ran on computer software were large and complex, and the stakes for proper outcomes were equally high: we wouldn’t have wanted software-driven missile control systems to malfunction (although they did!), nor satellites and rockets to crash back to Earth. Today, proper testing and validation do far more than keep your Netflix password safe; they also keep the underlying electrical grids running and ensure that planes, trains, and automobiles safely get where they need to be.
In a scientific setting where we use laboratory informatics software, CSV ensures that a computerized system is fit for its intended use and functions as designed. And any time that system is upgraded or customized, we validate the change to confirm that the new functionality has not broken any existing functionality.
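In practice, that change-validation step often takes the shape of a regression suite: known-good results are re-checked after every upgrade, alongside tests for the new behavior. Here is a minimal sketch in Python; the function and its parameters are hypothetical illustrations, not the API of any specific informatics product.

```python
# Hypothetical calculation that an informatics system might perform.
# After an upgrade adds input checking, we re-verify the old behavior
# AND verify the new rule.

def dilution_factor(stock_volume_ml: float, total_volume_ml: float) -> float:
    """Existing, previously validated calculation."""
    if total_volume_ml <= 0:
        # New validation rule introduced by the upgrade.
        raise ValueError("total volume must be positive")
    return total_volume_ml / stock_volume_ml

def test_existing_functionality_unchanged():
    # Known-good results from the validated baseline must still hold.
    assert dilution_factor(1.0, 10.0) == 10.0
    assert dilution_factor(2.5, 25.0) == 10.0

def test_new_validation_rule():
    # The upgrade's new check works without breaking the baseline.
    try:
        dilution_factor(1.0, 0.0)
        assert False, "expected ValueError"
    except ValueError:
        pass

test_existing_functionality_unchanged()
test_new_validation_rule()
print("regression suite passed")
```

The point is not the arithmetic but the discipline: every change ships with evidence that what worked before still works.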
Unfortunately, it took some fatal mistakes to make computer system validation essential. The most notorious was probably the Therac-25 incident in the mid-1980s, in which a radiation therapy machine for cancer patients malfunctioned. Previous versions of the device had relied on hardware interlocks for safety, but the Therac-25 was the first to rely on software for its safety protocols. After the machine mistakenly irradiated at least six patients with far more energy than prescribed, resulting in several deaths, inadequate software testing was found to be largely to blame. For example, if a technician selected the wrong operating mode and then corrected the mistake quickly before starting the treatment, the software could fail to register the corrected input.
Another incident in which software errors turned deadly was the failure of the U.S. Patriot missile defense system to intercept a Scud missile during the Gulf War in February 1991. A rounding error in the system’s internal clock compounded over long operating times, steadily degrading the precision of the system’s ranging and targeting. At the time of the missile strike, the system had been running continuously for more than 100 hours, and the accumulated error left it unable to intercept the incoming missile. Technicians had noticed the rounding error and developed a fix, but the corrected code did not reach the system until the day after 28 soldiers lost their lives in the attack.
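The arithmetic behind that failure is well documented: the Patriot counted time in tenths of a second, and 1/10 has no exact binary representation, so the value stored in the system’s 24-bit register was slightly short on every tick. A minimal sketch of how that tiny per-tick error compounds (the figures follow the widely published analysis of the bug, not the actual missile code):

```python
# 1/10 chopped to 23 fractional bits, as in the Patriot's 24-bit register.
stored_tenth = int(0.1 * 2**23) / 2**23      # ≈ 0.0999999046
error_per_tick = 0.1 - stored_tenth          # ≈ 9.5e-8 seconds lost per tick

# Ticks accumulated over 100 hours of continuous operation (10 per second).
ticks = 100 * 3600 * 10
drift = error_per_tick * ticks               # ≈ 0.34 seconds of clock drift

print(f"clock drift after 100 hours: {drift:.2f} s")

# At a Scud's closing speed (~1,676 m/s), a third of a second shifts the
# tracking gate by hundreds of meters -- enough to lose the target.
print(f"approximate tracking error: {drift * 1676:.0f} m")
```

A ninety-billionth of a second per tick sounds negligible; thirty-six million ticks later, it is not.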
Sometimes, mistakes are merely expensive. In 2012, dormant orphan code in brokerage firm Knight Capital’s trading system was inadvertently reactivated when someone saved new code to the wrong location. In addition to losing more than US$400 million in just 45 minutes during the resulting trading breakdown, Knight Capital was also charged US$12 million by the Securities and Exchange Commission for failing to have proper risk management controls in place for its automated trading system.
Unfortunately, it isn’t hard to find examples of software failures with results ranging from minor to catastrophic. Software failures are inevitable and often costly, and that is the chief reason computer software needs to be validated, especially in regulated laboratories where we manufacture products for direct patient use.
The functions that we entrust to computer software are ever more complex and personal. When that information is compromised or held hostage by security breaches, the transactions that are an integral part of our everyday lives become difficult or impossible. Validation or assurance of laboratory informatics systems and the software that underpins them is vital because it helps avoid such breaches by ensuring that the software functions as it should and that checks and controls are in place to prevent improper use. The U.S. Food and Drug Administration (FDA) has recently placed more emphasis on data integrity, and many of its citations on this topic involve laboratory informatics systems or instruments.
As mentioned earlier in this blog, another direction for computer system validation is computer software assurance, in which a more risk-based approach to testing may become the preferred and encouraged method. Several other posts on this blog cover computer software assurance.
Additionally, the FDA is pushing for more regulation in newer industries, such as nutraceuticals and cannabis. Some might see the validation of laboratory informatics systems as low stakes in the grand scheme of things; it’s not a life-or-death matter, after all. However, your laboratory informatics system is key to your business. Validation is a significant part of the cost of any system, and knowing that yours is fit for its intended use is vital to ensuring your lab can continue to do its work. CSV and software assurance have a long and important history and will continue to provide value to organizations as long as we use computerized systems.
How has system validation or software assurance made a difference to your organization? Tell us about it in the comments.