AI Governance Maturity Assessment


Purpose

This template provides an AI governance maturity assessment with maturity level definitions, per-domain capability scoring, gap analysis, and a 12-month improvement roadmap.

Related Controls

  • ISO Clause 10
  • NIST GV-4

1. Assessment Overview

Capture assessment metadata and methodology.

Organization: [ORGANIZATION NAME]

Assessment Date: [DATE]

Assessor: [NAME], [ROLE TITLE]

Assessment Type: Initial / Annual / Triggered

Methodology: Self-assessment against the AI Governance Framework's 6 lifecycle domains and 43 controls, scored against a 5-level maturity model.

Scope: All AI systems, policies, and governance processes at [ORGANIZATION NAME]

Previous Assessment Date: [DATE or N/A]

Previous Overall Score: [SCORE or N/A]

2. Maturity Level Definitions

Define the 5 maturity levels used for scoring.

| Level | Name | Description |
|---|---|---|
| 1 | Initial | AI governance is ad hoc, reactive, and undocumented. No formal policies, processes, or assigned responsibilities. AI adoption occurs without oversight. |
| 2 | Developing | Basic policies exist but are inconsistently applied. Some roles assigned. Governance is project-specific rather than organizational. Limited monitoring and reporting. |
| 3 | Defined | Formal governance framework established with documented policies, roles, and processes. Controls are implemented across most AI systems. Regular reporting and review cycles. |
| 4 | Managed | Governance is measured quantitatively. KPIs/KRIs are tracked and drive decisions. Processes are consistently followed. Continuous monitoring and proactive risk management. |
| 5 | Optimizing | Governance is continuously improved based on lessons learned, industry best practices, and emerging threats. Organization is a leader in AI governance. Innovation is balanced with robust risk management. |

3. Domain Assessment

Score each domain's capabilities against the maturity levels. Provide evidence for each score.

| Domain | Capability | Current Level (1-5) | Evidence | Target Level | Gap |
|---|---|---|---|---|---|
| GOVERN | AI Policy & Strategy | | | | |
| | Roles & Accountability | | | | |
| | Risk Management | | | | |
| BUILD | Secure Development | | | | |
| | Data Pipeline Quality | | | | |
| | Testing & Validation | | | | |
| SECURE | Threat Management | | | | |
| | Agent Security | | | | |
| | Adversarial Resilience | | | | |
| DEPLOY | Release Management | | | | |
| | Change Control | | | | |
| | Performance Management | | | | |
| MONITOR | Continuous Monitoring | | | | |
| | Incident Response | | | | |
| | Compliance Reporting | | | | |
| IMPROVE | Lessons Learned | | | | |
| | Gap Remediation | | | | |
| | Framework Alignment | | | | |
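Once capability scores are recorded, the per-domain averages and gaps used in the next section follow mechanically. A minimal sketch of that arithmetic (the capability scores below are illustrative placeholders, not real assessment data):

```python
# Sketch: derive per-domain current/target averages and gaps from
# capability scores. All scores here are illustrative placeholders.
CAPABILITY_SCORES = {
    # capability -> (current level, target level)
    "GOVERN": {
        "AI Policy & Strategy": (2, 4),
        "Roles & Accountability": (3, 4),
        "Risk Management": (2, 4),
    },
    "SECURE": {
        "Threat Management": (1, 3),
        "Agent Security": (1, 3),
        "Adversarial Resilience": (2, 3),
    },
}

def domain_summary(scores):
    """Return {domain: (current_avg, target_avg, gap)}, rounded to one decimal."""
    summary = {}
    for domain, caps in scores.items():
        current = sum(c for c, _ in caps.values()) / len(caps)
        target = sum(t for _, t in caps.values()) / len(caps)
        summary[domain] = (round(current, 1), round(target, 1), round(target - current, 1))
    return summary

print(domain_summary(CAPABILITY_SCORES))
```

The same averages feed the Gap Analysis Summary and, averaged again across domains, the Overall Maturity Score in the Executive Summary.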

4. Gap Analysis Summary

Summarize the largest gaps between current and target maturity by domain.

| Domain | Current Avg | Target Avg | Gap | Priority | Owner |
|---|---|---|---|---|---|
| GOVERN | | | | Critical / High / Medium / Low | [NAME] |
| BUILD | | | | | |
| SECURE | | | | | |
| DEPLOY | | | | | |
| MONITOR | | | | | |
| IMPROVE | | | | | |

Gap Prioritization Criteria

  • Critical: Gap ≥ 3 levels or regulatory requirement not met
  • High: Gap ≥ 2 levels or security control missing
  • Medium: Gap = 1 level in important domain
  • Low: Gap = 1 level in supporting domain
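The criteria above amount to a simple decision rule. A sketch, assuming gaps are measured in whole levels and the regulatory, security, and domain-importance flags are supplied by the assessor:

```python
def gap_priority(gap_levels, regulatory_unmet=False,
                 security_control_missing=False, supporting_domain=False):
    """Classify a maturity gap per the prioritization criteria above."""
    if gap_levels >= 3 or regulatory_unmet:
        return "Critical"   # gap >= 3 levels, or regulatory requirement not met
    if gap_levels >= 2 or security_control_missing:
        return "High"       # gap >= 2 levels, or security control missing
    if gap_levels >= 1:
        # gap of 1 level: Medium in an important domain, Low in a supporting one
        return "Low" if supporting_domain else "Medium"
    return "None"           # no gap
```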

5. Improvement Roadmap

Define specific initiatives to close maturity gaps over the next 12 months.

| Initiative | Domain | Target Quarter | Owner | Expected Level Change | Status |
|---|---|---|---|---|---|
| [INITIATIVE — e.g., "Formalize AI policy framework"] | GOVERN | Q1 | [NAME] | 1 → 3 | Not Started |
| [INITIATIVE — e.g., "Implement automated security testing"] | SECURE | Q2 | [NAME] | 2 → 3 | Not Started |
| [INITIATIVE — e.g., "Deploy governance dashboard"] | MONITOR | Q2 | [NAME] | 2 → 4 | Not Started |
| [INITIATIVE — e.g., "Establish red team program"] | SECURE | Q3 | [NAME] | 1 → 3 | Not Started |
| [INITIATIVE — e.g., "Implement post-incident review process"] | IMPROVE | Q3 | [NAME] | 2 → 3 | Not Started |

6. Executive Summary

Provide a concise summary for leadership with overall scores and key recommendations.

Overall Maturity Score

Current: [X.X] / 5.0

Target (12-month): [X.X] / 5.0

Previous Assessment: [X.X] / 5.0 (or N/A)

Domain Scores Summary

  • GOVERN: [X.X] / 5.0
  • BUILD: [X.X] / 5.0
  • SECURE: [X.X] / 5.0
  • DEPLOY: [X.X] / 5.0
  • MONITOR: [X.X] / 5.0
  • IMPROVE: [X.X] / 5.0

Top 3 Recommendations

  1. [RECOMMENDATION] — Priority: [CRITICAL/HIGH] — Expected impact: [DESCRIPTION]
  2. [RECOMMENDATION] — Priority: [CRITICAL/HIGH] — Expected impact: [DESCRIPTION]
  3. [RECOMMENDATION] — Priority: [HIGH/MEDIUM] — Expected impact: [DESCRIPTION]

Investment Required

  • Staffing: [DESCRIPTION]
  • Tooling: [DESCRIPTION]
  • Training: [DESCRIPTION]
  • External: [DESCRIPTION — consulting, audit, certification]

7. Year-over-Year Comparison

Track maturity progression across annual assessments.

| Domain | [YEAR-2] | [YEAR-1] | [CURRENT YEAR] | Trend |
|---|---|---|---|---|
| GOVERN | | | | ↑ / → / ↓ |
| BUILD | | | | |
| SECURE | | | | |
| DEPLOY | | | | |
| MONITOR | | | | |
| IMPROVE | | | | |
| OVERALL | | | | |
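The trend arrows can be derived from consecutive annual scores. A sketch, assuming a small threshold (0.1 here, an arbitrary choice) below which a change counts as flat:

```python
def trend(previous, current, threshold=0.1):
    """Map a year-over-year score change to a trend arrow: up, flat, or down."""
    delta = current - previous
    if delta > threshold:
        return "↑"
    if delta < -threshold:
        return "↓"
    return "→"
```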

Assessment History:

  • [DATE]: Initial assessment — Overall: [X.X]
  • [DATE]: Annual review — Overall: [X.X]
  • [DATE]: Current assessment — Overall: [X.X]

Signed:

  • Assessor: [NAME] — [DATE]
  • AI Governance Committee Chair: [NAME] — [DATE]