07. Auditable Formal Testing Practices
Testing without formal documentation is meaningless for audit and inspection purposes and a waste of precious resources for GXP systems.
Testing for any OQ, IQ, or PQ package must be performed under a management-approved Test Plan that fits into the context of a management-approved Verification Plan (for a supplier OQ package) or Validation Plan (for end user IQ/PQ packages). If testing occurs after Go-Live, it can be performed under an Ongoing Test Plan established by the Validation Plan, or under a documented formal Change Control SOP process.
Roles and responsibilities for formal testing should be set up to prevent “conflicts of interest.” A person writing code cannot be the formal tester of that same code. The author of a test case/script cannot be the approver or tester of that same test, and ideally should not be the reviewer/witness for that script either. A system where one person writes the Validation Plan, the Test Plan, and all the test cases/scripts, and then performs the tests, is not a “validated” system, no matter how much paper is produced.
For every test case/script executed, at least one piece of system evidence must be produced, e.g., a screen shot, printout, or system-generated value recorded to show that the system responded. The evidence document must be initialed and dated by the tester, with a label applied to trace the evidence to the test step that produced it, as there may be many other steps performed in the test. The goal is not to flood the effort with screen shots of every step, but to capture evidence at strategic steps that shows all prior steps were properly executed in order to achieve the captured output.
The requirements being tested should be listed on each test case/script and the testing site recorded, e.g., HQ, Lab, EU, US, Device. The test run number should also be recorded, e.g., 1 for the first run of the script, 2 for the first repeat run, etc. There should also be a space for the Tester to record unexpected events that happen during testing, and for the Test Coordinator to record the resolution of those events. Examples could be “coffee spilled on keyboard/new keyboard connected and testing continued” or “tester became ill/tester sent home & test aborted to be restarted another time.”
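For teams tracking executed scripts electronically, the fields described above can be sketched as a simple record. This is a minimal illustrative sketch only; the class and field names are assumptions, not a prescribed or regulatory format.

```python
from dataclasses import dataclass, field

@dataclass
class TestRunRecord:
    """Hypothetical metadata carried by one executed test case/script."""
    script_id: str                 # identifier of the test case/script
    requirements: list             # requirement IDs covered by this script
    site: str                      # testing site, e.g., "HQ", "Lab", "EU", "US", "Device"
    run_number: int                # 1 = first run, 2 = first repeat run, ...
    unexpected_events: list = field(default_factory=list)  # recorded by the Tester
    resolutions: list = field(default_factory=list)        # recorded by the Test Coordinator

    def log_event(self, event: str, resolution: str) -> None:
        """Pair each unexpected event with its documented resolution."""
        self.unexpected_events.append(event)
        self.resolutions.append(resolution)

# Example: a first repeat run at the Lab site covering two requirements.
run = TestRunRecord("TC-014", ["URS-021", "URS-022"], site="Lab", run_number=2)
run.log_event("coffee spilled on keyboard",
              "new keyboard connected and testing continued")
print(run.run_number, len(run.unexpected_events))
```

The point of the structure is simply that every run keeps its requirements, site, run number, and event/resolution pairs together, so the paper (or electronic) record tells the whole story of that execution.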
Auditors and inspectors know that testing can be a complex and messy process, for large systems in particular. In fact, too clean a set of manual test documentation raises a flag of suspicion that perhaps it is not original. It is important that the evidence tells the whole story of testing, warts and all.
For end user PQ testing, the work process itself should be used to test the system in three flavors: Vanilla (a normal work run with normal data); Chocolate (a problem work run with expected data and/or process issues); and Strawberry (another problem run covering special issues/stress cases).
Next Month: IT Infrastructure Role in Validation
It is important that IT staff protect themselves from being given the tasks that end users should perform in validation, because the work process and its GXP data are not fully known by IT personnel… Separate from the Software Application URS, a Platform Requirements Specification (PRS) should be defined.