The life sciences industry relies on statistical programming software from companies such as SAS and Minitab
for manufacturing process and supply chain optimization and for regulatory submissions. These tools enable
process developers to conduct complex modeling, generate reproducible reports, and comply with
regulatory requirements.
Statistical analysis provides a backbone for establishing critical process parameters and helps bring
variability to heel. For regulatory submissions, it offers evidence for process validation and reproducibility.
However, statistical software is often overlooked as a GxP computerized system. This
neglect can have significant implications, given the recent increasing focus on data integrity.
Statistical programming tools, in which models are custom coded and which offer greater flexibility, are
becoming popular in place of software that provides models out of the box. These statistical programming tools
generally mean that a programming platform must be installed and qualified before a statistical model can
be developed and validated for intended use with GxP data. For example, GAMP 5¹ identifies that
statistical programming tools may be classified as Category 1 infrastructure software, but this category
excludes business applications, such as the statistical models developed using these packages.
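To make the platform/application distinction concrete, here is a minimal, hypothetical sketch (in Python, used only as an illustrative programming platform): the interpreter and numerical libraries would be the Category 1 infrastructure to be installed and qualified, while the custom-coded model below would be the business application requiring validation for its intended GxP use.

```python
# Hypothetical illustration only: the Python/NumPy installation plays the role
# of the qualified programming platform (GAMP 5 Category 1 infrastructure);
# this custom-coded capability calculation is the business application that
# would need to be validated for intended use with GxP data.
import numpy as np

def process_capability_cpk(measurements, lsl, usl):
    """Estimate Cpk for a critical process parameter from sample data."""
    mean = np.mean(measurements)
    std = np.std(measurements, ddof=1)  # sample standard deviation
    return min((usl - mean) / (3 * std), (mean - lsl) / (3 * std))

# Illustrative batch data and specification limits (not real GxP data)
batch = [99.8, 100.1, 100.3, 99.9, 100.0, 100.2, 99.7, 100.1]
print(f"Cpk = {process_capability_cpk(batch, lsl=99.0, usl=101.0):.2f}")
```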
In addition, many statistical software programming tools are moving to the cloud and leveraging more
complicated statistical models enabled by artificial intelligence and machine learning.
To assist you in ensuring data integrity and reliability for the statistical analysis of GxP data, we will guide
you through the top 10 data integrity requests an auditor might make during an audit of a statistical
software system.
Top 10 Data Integrity Requests for Statistical Software
1. GxP computerized systems inventory list
An inventory list will immediately indicate whether the statistical programming software is within the scope of GxP practices, along with common details such as GAMP category, ownership (business and system owners), the system release date for use, and the risk category. Specific details might include whether the programming platform is independent of the business application/statistical model.
2. Governance SOPs for data, validation, and risk
Standard operating procedures (SOPs) can be reviewed for the following:
- Control and management of records, demonstrating ALCOA+
- GxP data review processes
- GxP categorization for computerized systems and how this is documented
- Overall risk management of GxP computerized systems
- Validation Policy
3. SOPs for statistical model generation and validation
Key areas of interest include:
- Data governance practices, from collection to controlled storage and maintenance
- Procedures for managing and reviewing statistical models
- Validation processes for the statistical accuracy of model calculations
- For AI/ML models, additional considerations around training data and controls to prevent issues such as overfitting or bias (a sketch of one such check follows below)
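As one illustration of the overfitting point above, the following hypothetical sketch (assuming Python with scikit-learn, which the article does not prescribe) compares training performance against a held-out set; the data, model, and acceptance threshold are illustrative only.

```python
# Hypothetical sketch of an overfitting check an SOP might require for an
# AI/ML model: compare training vs. held-out performance and flag large gaps.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))               # illustrative process parameters
y = 2.0 * X[:, 0] + rng.normal(size=200)    # illustrative response

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

train_r2 = model.score(X_train, y_train)
test_r2 = model.score(X_test, y_test)
print(f"train R^2 = {train_r2:.2f}, holdout R^2 = {test_r2:.2f}")
if train_r2 - test_r2 > 0.2:  # illustrative acceptance criterion
    print("Possible overfitting: review training data, features, and model complexity")
```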
4. Validation report for statistical models
A validation report details the model's intended use, testing outcomes, and any referenced risk
assessments. Where the software provides statistical models out of the box as standard, the validation report for the software and the model may be combined. Where the model is custom programmed by the business on statistical software, there may be separate validation reports.
5. Requirements Traceability Matrix (RTM)
Demonstrates how user requirements for statistical models were captured and tested, linking them to validation steps and ensuring traceability (a simple traceability check is sketched below).
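As a minimal illustration (not a prescribed RTM format), the sketch below shows the kind of automated completeness check that can be run over an RTM export: every requirement should trace to at least one passing test. All identifiers and field names are hypothetical.

```python
# Hypothetical RTM completeness check; rows, identifiers, and fields are
# illustrative only and would normally come from an exported RTM document.
rtm = [
    {"requirement": "URS-001", "test_case": "TC-101", "result": "Pass"},
    {"requirement": "URS-002", "test_case": "TC-102", "result": "Fail"},
    {"requirement": "URS-003", "test_case": None,     "result": None},
]

untraced = [r["requirement"] for r in rtm if not r["test_case"]]
failing = [r["requirement"] for r in rtm if r["test_case"] and r["result"] != "Pass"]

print("Requirements without linked tests:", untraced or "none")
print("Requirements with non-passing tests:", failing or "none")
```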
6. Evidence of model and code review
Where statistical models are built on statistical programming software, code review processes would help determine that:
- The statistical basis is sound
- Code is version-controlled and maintained
- Unit tests are performed as required, particularly for independently developed/open-source components (an example test is sketched below)
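A minimal, hypothetical example of such a unit test, assuming Python with pytest and NumPy (neither is prescribed by the article): a custom-coded calculation is checked against an independent reference implementation.

```python
# Hypothetical unit test that a code review could look for: the custom
# sample-standard-deviation routine is verified against NumPy as an
# independent reference. Function and test names are illustrative.
import numpy as np
import pytest

def sample_std(values):
    """Custom-coded sample standard deviation used by a statistical model."""
    x = np.asarray(values, dtype=float)
    return float(np.sqrt(np.sum((x - x.mean()) ** 2) / (len(x) - 1)))

def test_sample_std_matches_reference():
    data = [9.8, 10.1, 10.0, 9.9, 10.2]
    assert sample_std(data) == pytest.approx(np.std(data, ddof=1))
```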
7. SOP(s) for software administration
Software administration practices and controls include software release management, vendor
management, patient data security and the protection of human subjects, user access controls,
backup/restore procedures, business continuity of data, and periodic reviews.
8. Evidence of backup and restore capabilities
The presence of successful backup and restore evidence is crucial to avoid vulnerability to data loss (one way to produce such evidence is sketched below).
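One way such evidence can be generated is sketched below, assuming a file-based dataset and a Python environment (both illustrative assumptions): after a test restore, checksums of the restored files are compared with the originals.

```python
# Hypothetical restore-verification check: compare SHA-256 checksums of a
# dataset restored from backup against the original. Paths are placeholders.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

original = Path("gxp_data/batch_records.csv")       # live dataset
restored = Path("restore_test/batch_records.csv")   # copy restored from backup

if sha256_of(original) == sha256_of(restored):
    print("Restore verified: checksums match")
else:
    print("Restore check FAILED: checksums differ; investigate backup integrity")
```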
9. Quality management system (QMS) review
The QMS review covers quality records related to statistical software, including deviations, corrective
actions, and change management.
10. Last completed periodic review
This looks at evidence of regular evaluations of your statistical software, confirming it remains compliant and fit for purpose. Periodic reviews will inform whether risk categorization is still correct or if additional controls now need to be applied.
Conclusion
The No. 1 FDA finding is a lack of procedures, so, as part of a data integrity audit, make sure you have the
requisite procedures in place to keep your data controlled.
Preparing for a statistical software audit can be daunting, but a systematic approach ensures compliance
while supporting the reliability of your drug development processes.
ERA Sciences, with its expertise in digital GxP compliance, offers tools like Phanero to streamline inventory management and simplify audit preparation.
References:
- GAMP 5, A Risk-Based Approach to Compliant GxP Computerized Systems, Second Edition