AI Governance Maturity Assessment
Purpose
An AI governance maturity assessment tool with maturity level definitions, domain scoring, gap analysis, and an improvement roadmap.
Related Controls
1. Assessment Overview
Capture assessment metadata and methodology.
Organization: [ORGANIZATION NAME]
Assessment Date: [DATE]
Assessor: [NAME], [ROLE TITLE]
Assessment Type: Initial / Annual / Triggered
Methodology: Self-assessment against the AI Governance Framework's 6 lifecycle domains and 43 controls, scored against a 5-level maturity model.
Scope: All AI systems, policies, and governance processes at [ORGANIZATION NAME]
Previous Assessment Date: [DATE or N/A]
Previous Overall Score: [SCORE or N/A]
2. Maturity Level Definitions
Define the 5 maturity levels used for scoring.
| Level | Name | Description |
|---|---|---|
| 1 | Initial | AI governance is ad hoc, reactive, and undocumented. No formal policies, processes, or assigned responsibilities. AI adoption occurs without oversight. |
| 2 | Developing | Basic policies exist but are inconsistently applied. Some roles assigned. Governance is project-specific rather than organizational. Limited monitoring and reporting. |
| 3 | Defined | Formal governance framework established with documented policies, roles, and processes. Controls are implemented across most AI systems. Regular reporting and review cycles. |
| 4 | Managed | Governance is measured quantitatively. KPIs/KRIs are tracked and drive decisions. Processes are consistently followed. Continuous monitoring and proactive risk management. |
| 5 | Optimizing | Governance is continuously improved based on lessons learned, industry best practices, and emerging threats. Organization is a leader in AI governance. Innovation is balanced with robust risk management. |
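Where scoring is done in a script or spreadsheet export rather than by hand, the five levels can be encoded as a simple lookup for validating and labeling scores. The sketch below is illustrative only; the `MATURITY_LEVELS` mapping and `label` helper are hypothetical names, not part of the framework.

```python
# Illustrative encoding of the 5-level maturity model defined above.
MATURITY_LEVELS = {
    1: "Initial",
    2: "Developing",
    3: "Defined",
    4: "Managed",
    5: "Optimizing",
}

def label(score: int) -> str:
    """Return the level name for a 1-5 maturity score, rejecting out-of-range values."""
    if score not in MATURITY_LEVELS:
        raise ValueError(f"Maturity score must be an integer from 1 to 5, got {score!r}")
    return MATURITY_LEVELS[score]
```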
3. Domain Assessment
Score each domain's capabilities against the maturity levels. Provide evidence for each score.
| Domain | Capability | Current Level (1-5) | Evidence | Target Level | Gap |
|---|---|---|---|---|---|
| GOVERN | AI Policy & Strategy | | | | |
| | Roles & Accountability | | | | |
| | Risk Management | | | | |
| BUILD | Secure Development | | | | |
| | Data Pipeline Quality | | | | |
| | Testing & Validation | | | | |
| SECURE | Threat Management | | | | |
| | Agent Security | | | | |
| | Adversarial Resilience | | | | |
| DEPLOY | Release Management | | | | |
| | Change Control | | | | |
| | Performance Management | | | | |
| MONITOR | Continuous Monitoring | | | | |
| | Incident Response | | | | |
| | Compliance Reporting | | | | |
| IMPROVE | Lessons Learned | | | | |
| | Gap Remediation | | | | |
| | Framework Alignment | | | | |
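The domain averages used in the Gap Analysis Summary are derived from the capability scores in this table: a domain's current average is the mean of its capabilities' current levels, its target average is the mean of the target levels, and the gap is the difference. A minimal sketch of that calculation, assuming capability scores are collected as per-domain lists of (current, target) pairs (the `domain_scores` structure and values are illustrative, not real results):

```python
from statistics import mean

# Illustrative (current, target) pairs per domain, taken from the
# Domain Assessment table above. Replace with real assessment values.
domain_scores = {
    "GOVERN": [(2, 4), (1, 3), (2, 4)],  # Policy & Strategy, Roles, Risk Mgmt
    "SECURE": [(1, 3), (2, 3), (1, 3)],  # Threat Mgmt, Agent Security, Adversarial Resilience
}

def domain_summary(scores):
    """Return (current_avg, target_avg, gap), each rounded to one decimal place."""
    current_avg = round(mean(c for c, _ in scores), 1)
    target_avg = round(mean(t for _, t in scores), 1)
    return current_avg, target_avg, round(target_avg - current_avg, 1)

for domain, scores in domain_scores.items():
    cur, tgt, gap = domain_summary(scores)
    print(f"{domain}: current {cur}, target {tgt}, gap {gap}")
```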
4. Gap Analysis Summary
Summarize the largest gaps between current and target maturity by domain.
| Domain | Current Avg | Target Avg | Gap | Priority | Owner |
|---|---|---|---|---|---|
| GOVERN | | | | Critical / High / Medium / Low | [NAME] |
| BUILD | | | | | |
| SECURE | | | | | |
| DEPLOY | | | | | |
| MONITOR | | | | | |
| IMPROVE | | | | | |
Gap Prioritization Criteria
- Critical: Gap ≥ 3 levels or regulatory requirement not met
- High: Gap ≥ 2 levels or security control missing
- Medium: Gap = 1 level in important domain
- Low: Gap = 1 level in supporting domain
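The criteria above can be applied mechanically once domain gaps are known. A minimal sketch, assuming boolean flags are recorded per domain for unmet regulatory requirements and missing security controls (the `gap_priority` function and its parameter names are hypothetical, and gaps under one level are treated as needing no action, which the criteria do not define):

```python
def gap_priority(gap: float,
                 regulatory_unmet: bool = False,
                 security_control_missing: bool = False,
                 supporting_domain: bool = False) -> str:
    """Classify a domain's maturity gap using the prioritization criteria above."""
    if gap >= 3 or regulatory_unmet:
        return "Critical"
    if gap >= 2 or security_control_missing:
        return "High"
    if gap >= 1:
        return "Low" if supporting_domain else "Medium"
    return "None"  # gap below one level: not covered by the criteria

# Example: a 2-level gap in SECURE with a missing security control -> "High"
print(gap_priority(2, security_control_missing=True))
```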
5. Improvement Roadmap
Define specific initiatives to close maturity gaps over the next 12 months.
| Initiative | Domain | Target Quarter | Owner | Expected Level Change | Status |
|---|---|---|---|---|---|
| [INITIATIVE — e.g., "Formalize AI policy framework"] | GOVERN | Q1 | [NAME] | 1 → 3 | Not Started |
| [INITIATIVE — e.g., "Implement automated security testing"] | SECURE | Q2 | [NAME] | 2 → 3 | Not Started |
| [INITIATIVE — e.g., "Deploy governance dashboard"] | MONITOR | Q2 | [NAME] | 2 → 4 | Not Started |
| [INITIATIVE — e.g., "Establish red team program"] | SECURE | Q3 | [NAME] | 1 → 3 | Not Started |
| [INITIATIVE — e.g., "Implement post-incident review process"] | IMPROVE | Q3 | [NAME] | 2 → 3 | Not Started |
6. Executive Summary
Provide a concise summary for leadership with overall scores and key recommendations.
Overall Maturity Score
Current: [X.X] / 5.0
Target (12-month): [X.X] / 5.0
Previous Assessment: [X.X] / 5.0 (or N/A)
Domain Scores Summary
- GOVERN: [X.X] / 5.0
- BUILD: [X.X] / 5.0
- SECURE: [X.X] / 5.0
- DEPLOY: [X.X] / 5.0
- MONITOR: [X.X] / 5.0
- IMPROVE: [X.X] / 5.0
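The template does not prescribe a formula for the overall score; one common convention is the unweighted mean of the six domain averages, and whichever convention is used should be stated in the methodology. A minimal sketch of the unweighted calculation (the domain averages shown are placeholders, not real results):

```python
from statistics import mean

# Placeholder current-assessment domain averages; replace with real values.
domain_averages = {
    "GOVERN": 2.3, "BUILD": 2.7, "SECURE": 1.7,
    "DEPLOY": 2.5, "MONITOR": 2.0, "IMPROVE": 1.5,
}

overall = round(mean(domain_averages.values()), 1)
print(f"Overall maturity score: {overall} / 5.0")  # 2.1 with these placeholders
```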
Top 3 Recommendations
- [RECOMMENDATION] — Priority: [CRITICAL/HIGH] — Expected impact: [DESCRIPTION]
- [RECOMMENDATION] — Priority: [CRITICAL/HIGH] — Expected impact: [DESCRIPTION]
- [RECOMMENDATION] — Priority: [HIGH/MEDIUM] — Expected impact: [DESCRIPTION]
Investment Required
- Staffing: [DESCRIPTION]
- Tooling: [DESCRIPTION]
- Training: [DESCRIPTION]
- External: [DESCRIPTION — consulting, audit, certification]
7. Year-over-Year Comparison
Track maturity progression across annual assessments.
| Domain | [YEAR-2] | [YEAR-1] | [CURRENT YEAR] | Trend |
|---|---|---|---|---|
| GOVERN | | | | ↑ / → / ↓ |
| BUILD | | | | |
| SECURE | | | | |
| DEPLOY | | | | |
| MONITOR | | | | |
| IMPROVE | | | | |
| OVERALL | | | | |
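Trend arrows can be derived by comparing each domain's score with the prior assessment. A minimal sketch, assuming a small tolerance so rounding noise is not reported as movement (the `trend` helper and the 0.1 threshold are assumptions, not part of the framework):

```python
def trend(previous: float, current: float, tolerance: float = 0.1) -> str:
    """Return an up/flat/down arrow comparing two annual maturity scores."""
    delta = current - previous
    if delta > tolerance:
        return "↑"
    if delta < -tolerance:
        return "↓"
    return "→"

print(trend(2.0, 2.4))  # ↑
print(trend(3.1, 3.0))  # → (change within tolerance)
```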
Assessment History:
- [DATE]: Initial assessment — Overall: [X.X]
- [DATE]: Annual review — Overall: [X.X]
- [DATE]: Current assessment — Overall: [X.X]
Signed:
- Assessor: [NAME] — [DATE]
- AI Governance Committee Chair: [NAME] — [DATE]