# AI Audit Checklist
This checklist provides a structured framework for auditing AI systems across six key categories. Each item includes the required evidence for demonstrating compliance. Use this as a baseline and adapt to your organization's regulatory requirements, risk profile, and AI maturity level.
## Data Governance
| Audit Item | Evidence Requirements | Priority |
|---|---|---|
| Data inventory and classification | Data catalog with classification labels; data lineage documentation; data owner assignments. | Required |
| Data quality management | Data quality metrics and thresholds; data validation procedures; quality monitoring dashboards. | Required |
| Consent and legal basis | Consent records; legal basis documentation for each data processing activity; data processing agreements. | Required |
| Data retention and deletion | Retention policy; deletion procedures and logs; evidence of periodic data review and purge. | Required |
| Training data documentation | Training dataset cards; data source provenance; annotation guidelines and quality reports; bias assessment of training data. | Required |
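As a concrete illustration of the data quality item above, an auditor might look for an automated gate that compares measured metrics against documented thresholds. The sketch below is a minimal example; the metric names (`completeness`, `uniqueness`, `freshness_days`) and threshold values are hypothetical placeholders, not prescribed values.

```python
# Illustrative data-quality gate: compare measured metrics against
# documented thresholds. Metric names and thresholds are hypothetical.

QUALITY_THRESHOLDS = {
    "completeness": 0.98,   # minimum share of non-null values
    "uniqueness": 0.99,     # minimum share of non-duplicate records
    "freshness_days": 7,    # maximum age of the newest record, in days
}

def evaluate_quality(metrics: dict) -> dict:
    """Return pass/fail per metric; freshness_days is lower-is-better."""
    results = {}
    for name, threshold in QUALITY_THRESHOLDS.items():
        value = metrics[name]
        if name == "freshness_days":
            results[name] = value <= threshold
        else:
            results[name] = value >= threshold
    return results

measured = {"completeness": 0.995, "uniqueness": 0.97, "freshness_days": 3}
print(evaluate_quality(measured))
```

The pass/fail output per metric, together with the threshold definitions, doubles as audit evidence for the "data quality metrics and thresholds" requirement.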
## Model Documentation
| Audit Item | Evidence Requirements | Priority |
|---|---|---|
| Model card or factsheet | Completed model card or factsheet following an established template (e.g., Google Model Cards, IBM AI FactSheets); intended use, limitations, performance metrics, and ethical considerations documented. | Required |
| Architecture and design documentation | System architecture diagrams; model architecture description; feature engineering documentation; hyperparameter records. | Required |
| Version control and change history | Model version registry; change logs; training run records with reproducibility information (seeds, configurations, data snapshots). | Required |
| Deployment documentation | Deployment architecture; infrastructure requirements; rollback procedures; A/B testing methodology (if applicable). | Required |
| Decommissioning plan | End-of-life criteria; data disposal procedures; stakeholder notification plan; transition procedures. | Recommended |
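The reproducibility information called for in the version-control item can be captured in a structured record per training run. The sketch below shows one possible shape; the model name, hyperparameters, and snapshot identifier are hypothetical examples, and a real registry would likely add fields (training code commit, environment, metrics).

```python
# Illustrative reproducibility record for a model version registry.
# All field values shown are hypothetical placeholders.
from dataclasses import dataclass, asdict
import json

@dataclass
class TrainingRunRecord:
    """Minimal record linking a model version to its training inputs."""
    model_name: str
    version: str
    random_seed: int
    hyperparameters: dict
    data_snapshot_id: str  # e.g., a hash or tag of the training data snapshot

    def to_json(self) -> str:
        # Stable key order makes records easy to diff across versions.
        return json.dumps(asdict(self), sort_keys=True)

record = TrainingRunRecord(
    model_name="example-scorer",        # hypothetical
    version="2.3.0",
    random_seed=42,
    hyperparameters={"learning_rate": 0.01, "max_depth": 6},
    data_snapshot_id="snapshot-2025-01",  # placeholder identifier
)
print(record.to_json())
```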
## Testing and Validation
| Audit Item | Evidence Requirements | Priority |
|---|---|---|
| Pre-deployment testing | Test plans; test results with pass/fail criteria; performance benchmarks against baseline; edge case testing results. | Required |
| Adversarial and red-team testing | Red-team test plans and results; adversarial input testing; prompt injection testing (for LLM systems); robustness evaluation. | Required |
| User acceptance testing | UAT results from intended user population; usability assessment; feedback documentation and resolution. | Recommended |
| Regression testing | Regression test suite; results from each model update; comparison with prior version performance. | Required |
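The regression-testing item asks for comparison with prior-version performance. One simple form of evidence is an automated gate that fails a candidate model if any tracked metric degrades beyond a tolerance. The sketch below assumes hypothetical metric names and a hypothetical tolerance; both should come from your documented pass/fail criteria.

```python
# Illustrative regression gate: a model update passes only if no tracked
# metric trails the prior version by more than an allowed tolerance.
# Metric names and the tolerance are hypothetical.

TOLERANCE = 0.01  # maximum allowed absolute drop per metric

def regression_check(prior: dict, candidate: dict,
                     tolerance: float = TOLERANCE) -> dict:
    """Return per-metric pass/fail for the candidate model."""
    return {m: candidate[m] >= prior[m] - tolerance for m in prior}

prior = {"accuracy": 0.91, "recall": 0.84}
candidate = {"accuracy": 0.92, "recall": 0.80}
print(regression_check(prior, candidate))
# accuracy improves (pass); recall drops by 0.04 > tolerance (fail)
```

Storing each run's output alongside the model version registry gives the auditor "results from each model update" in one place.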
## Bias and Fairness Monitoring
| Audit Item | Evidence Requirements | Priority |
|---|---|---|
| Fairness metrics definition | Selected fairness metrics (demographic parity, equalized odds, etc.) with justification; threshold definitions. | Required |
| Pre-deployment bias assessment | Bias evaluation results across protected characteristics; disaggregated performance metrics; identified disparities and mitigations. | Required |
| Ongoing bias monitoring | Monitoring dashboard or reports; alert thresholds; frequency of assessment; drift detection for fairness metrics. | Required |
| Bias incident response | Documented incidents; root cause analysis; corrective actions taken; evidence of resolution effectiveness. | Required |
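To make the metric definitions concrete: demographic parity compares positive-prediction rates across groups, while equalized odds compares error rates (the sketch below computes only its true-positive-rate half, often called equal opportunity). This is plain-Python illustration over toy data; a production system would more likely use a dedicated library such as Fairlearn.

```python
# Illustrative fairness-metric computation over binary predictions,
# disaggregated by a protected attribute. Toy data, not real evidence.

def selection_rate(preds):
    """Share of positive predictions."""
    return sum(preds) / len(preds)

def demographic_parity_diff(preds_a, preds_b):
    """Absolute gap in positive-prediction rates between two groups."""
    return abs(selection_rate(preds_a) - selection_rate(preds_b))

def true_positive_rate(preds, labels):
    """Share of actual positives that were predicted positive."""
    positives = [p for p, y in zip(preds, labels) if y == 1]
    return sum(positives) / len(positives)

def equal_opportunity_diff(preds_a, labels_a, preds_b, labels_b):
    """Absolute TPR gap between groups (the TPR half of equalized odds)."""
    return abs(true_positive_rate(preds_a, labels_a)
               - true_positive_rate(preds_b, labels_b))

group_a_preds, group_a_labels = [1, 1, 0, 1], [1, 0, 0, 1]
group_b_preds, group_b_labels = [1, 0, 0, 0], [1, 1, 0, 0]
print(demographic_parity_diff(group_a_preds, group_b_preds))   # 0.5
print(equal_opportunity_diff(group_a_preds, group_a_labels,
                             group_b_preds, group_b_labels))   # 0.5
```

Evidence for the monitoring item would then be these values computed on a schedule, compared against the documented alert thresholds.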
## Incident Management
| Audit Item | Evidence Requirements | Priority |
|---|---|---|
| Incident reporting procedure | Defined reporting channels; severity classification criteria; escalation procedures; response time SLAs. | Required |
| Incident log and tracking | Centralized incident register; status tracking; resolution documentation; trend analysis. | Required |
| Root cause analysis | Completed RCA for significant incidents; contributing factor identification; preventive action plans. | Required |
| Lessons learned | Post-incident reviews; knowledge base updates; policy or procedure revisions triggered by incidents. | Recommended |
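The severity classification and response-time SLAs in the reporting item can be expressed as a simple lookup that tooling and auditors share. The tiers, acknowledgment times, and the toy classification rule below are hypothetical; substitute your organization's actual policy.

```python
# Illustrative severity-to-SLA mapping for AI incident response.
# Tiers, times, and the classification rule are hypothetical.

SEVERITY_SLAS = {
    "critical": {"ack_minutes": 15,   "escalate_to": "on-call lead"},
    "high":     {"ack_minutes": 60,   "escalate_to": "team lead"},
    "medium":   {"ack_minutes": 240,  "escalate_to": "ticket queue"},
    "low":      {"ack_minutes": 1440, "escalate_to": "ticket queue"},
}

def classify_incident(user_harm: bool, system_down: bool) -> str:
    """Toy classification rule: harm to users outranks availability."""
    if user_harm:
        return "critical"
    if system_down:
        return "high"
    return "medium"

severity = classify_incident(user_harm=False, system_down=True)
print(severity, SEVERITY_SLAS[severity]["ack_minutes"])
```

Keeping the mapping in code (or versioned config) makes the "severity classification criteria" auditable and diffable as the policy evolves.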
## Third-Party AI Vendor Assessment
| Audit Item | Evidence Requirements | Priority |
|---|---|---|
| Vendor due diligence | Vendor risk assessment questionnaire; security certifications (SOC 2, ISO 27001); AI-specific governance documentation. | Required |
| Contractual protections | Data processing agreements; liability clauses; SLAs for performance and availability; right to audit provisions. | Required |
| Model transparency | Vendor-provided model cards or documentation; performance benchmarks; known limitations and failure modes. | Required |
| Ongoing vendor monitoring | Periodic reassessment schedule; vendor performance reports; contract compliance reviews; incident notification records. | Recommended |
| Exit strategy | Data portability provisions; transition plan; alternative vendor assessment; business continuity procedures. | Recommended |
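The "periodic reassessment schedule" in the ongoing-monitoring item is easy to verify when due dates are computed rather than tracked by hand. The sketch below assumes an annual review interval; the interval and dates are hypothetical.

```python
# Illustrative check for whether a vendor's periodic reassessment is
# overdue. The review interval and dates are hypothetical.
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=365)  # assumed annual cadence

def reassessment_due(last_review: date, today: date) -> bool:
    """True once the review interval has elapsed since the last review."""
    return today - last_review >= REVIEW_INTERVAL

print(reassessment_due(date(2024, 1, 15), date(2025, 3, 1)))  # True
print(reassessment_due(date(2024, 1, 15), date(2024, 6, 1)))  # False
```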
## Using This Checklist
- Items marked Required represent baseline expectations for any AI system audit. Items marked Recommended reflect best practices that strengthen governance posture.
- Evidence types are illustrative. Acceptable evidence formats depend on your organization's documentation standards and the auditor's requirements.
- This checklist aligns with but does not replace requirements from specific frameworks (NIST AI RMF, EU AI Act, ISO 42001). Cross-reference with applicable regulatory requirements.