If system manuals and training materials do not reflect the installed version of the system, mistakes can occur in its use. If work process SOPs do not reference the work steps where the system is to be used, mistakes can reduce product or data quality in the GXP work process.


Passing audits or inspections is the LAST reason why anyone should want to validate a computerized system. Making sure the system performs the work activities that users actually need it to do is the FIRST reason to validate. Checking that the system is reliable in handling and protecting data, both under normal circumstances and in anticipated problem situations in the GXP work process, is the SECOND reason. Verifying that the system works as expected with network communications and in interaction with other systems and infrastructure is the THIRD reason to validate. Reasons one, two, and three are directly related to return on investment (ROI) for the cost of the system. After ROI comes the FOURTH reason to validate: compliance for audits and inspections.

As discussed in Chapter 1 of the book, all regulations seek assurance (i.e., evidence) that the use of computerized systems results in at least the same quality and integrity of data and product as came from prior systems or manual methods. Management, of course, wants more for their ROI. They want improved capacity and profitability from new systems that work as intended from go-live on day one. The end users themselves want a more efficient work process, with fewer tedious manual operations and more control of their output, without unplanned service interruptions. For their part, the IT department wants to check out technology before putting it into production, so that any initial issues can be resolved without a riot of angry users breathing down their necks. A common sense approach to computer validation meets all these needs.

It is important to have SOPs, Work Instructions, and templates/forms that define a consistent way to perform computer validation across the organization. The methodology should be flexible enough to allow for different levels of documentation based on the size and complexity of the target system. The SOP or Guideline on formal testing practices, however, should be applied with rigor across all systems to ensure that test documentation will meet audit and inspection standards. Tester resources should not be wasted by producing substandard evidence. Test developers and testers should be properly trained in the SOP practices and the relevant templates/forms and work instructions (WIs) at the start of a project, so that they know what audit and inspection standards apply to their roles.

Validation also requires that users map the system's role into their regulated work process. This means a review of user SOPs and WIs to verify that use of the system is correctly referenced in related work activities. Audits and inspections often find that this step, which must be performed by end users, is overlooked, and that systems go live without adjusted work instructions and system training records. Validation is a collaboration, and user management needs to document how the system is compatible with their work process just as much as the IT department does for its infrastructure.


Next Month: We test our systems all the time. Why do we have to document it?

Formal testing is performed in a documented manner that is traceable back to a management-approved unique requirement or specification item…Testing without formal documentation is meaningless for audit and inspection purposes and a waste of precious resources for GXP systems.