Vendor/Model Evaluation Rubric

Tier 2 GOVERN

What This Requires

Create standardized evaluation criteria for AI vendors and foundation models covering security, privacy, performance, compliance, and support. Require scoring and approval before procurement or integration.

Why It Matters

Ad hoc vendor selection leads to fragmented tooling, security gaps, and compliance violations. A rubric ensures consistent due diligence and enables like-for-like comparison across alternatives.

How To Implement

Define Evaluation Criteria

Create a scorecard with weighted categories: Security (SOC 2, data residency, encryption), Privacy (GDPR compliance, data retention), Performance (latency, uptime SLA), Compliance (certifications, audit rights), and Support (support SLA, documentation). Score each criterion from 0 to 5 and assign a weight to each category, as sketched below.
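
To make the arithmetic concrete, here is a minimal Python sketch of the weighted scoring. The category weights and criterion scores are hypothetical examples, not prescribed values; any weighting that sums to 1.0 works the same way.

    # Hypothetical scorecard: each category holds 0-5 criterion scores
    # and a weight; the weights in this example sum to 1.0.
    CATEGORIES = {
        # category: (weight, [criterion scores, each 0-5])
        "Security":    (0.30, [4, 5, 3]),   # SOC 2, data residency, encryption
        "Privacy":     (0.25, [4, 4]),      # GDPR compliance, data retention
        "Performance": (0.20, [3, 5]),      # latency, uptime SLA
        "Compliance":  (0.15, [4, 4]),      # certifications, audit rights
        "Support":     (0.10, [3, 5]),      # support SLA, documentation
    }

    def category_score(scores):
        """Average of the 0-5 criterion scores for one category."""
        return sum(scores) / len(scores)

    def weighted_total(categories):
        """Weighted total across categories, normalized to 0-100."""
        total = sum(w * category_score(s) for w, s in categories.values())
        return total / 5 * 100  # 5 is the maximum criterion score

    print(f"Total: {weighted_total(CATEGORIES):.1f}/100")  # Total: 80.0/100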

Set Approval Thresholds

Define a minimum passing score (e.g., 70/100) and category minimums (a vendor must score ≥3 in both Security and Privacy). Vendors that fall below the passing score require executive approval supported by a documented business justification.
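
A sketch of that approval gate, assuming per-category averages on the 0-5 scale and a weighted total on the 0-100 scale as computed above. The 70/100 floor and the ≥3 Security/Privacy minimums come from the rubric; treating a category-floor failure as an outright rejection (rather than an escalation) is an assumption here.

    PASSING_SCORE = 70                               # minimum weighted total
    CATEGORY_MINIMUMS = {"Security": 3, "Privacy": 3}

    def approval_decision(total, category_scores):
        """Return 'approved', 'rejected', or 'executive-review'."""
        # Hard floors: weakness in Security or Privacy cannot be
        # offset by strength elsewhere.
        for name, floor in CATEGORY_MINIMUMS.items():
            if category_scores.get(name, 0) < floor:
                return "rejected"
        if total >= PASSING_SCORE:
            return "approved"
        # Below the overall threshold: escalate for executive approval
        # with documented business justification, per the rubric.
        return "executive-review"

    print(approval_decision(80.0, {"Security": 4.0, "Privacy": 4.0}))  # approved
    print(approval_decision(65.0, {"Security": 4.0, "Privacy": 3.5}))  # executive-review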

Documentation Requirements

For each vendor, collect a completed security questionnaire, contract redlines, a data flow diagram, and compliance attestations. Store everything in a central repository (e.g., SharePoint or Confluence).
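
One way to keep the evidence package honest is a completeness check before approval is granted. The artifact names and record structure below are illustrative, not a mandated schema.

    REQUIRED_ARTIFACTS = [
        "security_questionnaire",
        "contract_redlines",
        "data_flow_diagram",
        "compliance_attestations",
    ]

    def missing_artifacts(vendor_record):
        """List required artifacts absent from a vendor's record."""
        return [a for a in REQUIRED_ARTIFACTS if not vendor_record.get(a)]

    record = {"security_questionnaire": "repo-link", "contract_redlines": None}
    print(missing_artifacts(record))
    # ['contract_redlines', 'data_flow_diagram', 'compliance_attestations']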

Continuous Monitoring

Re-evaluate each vendor annually or at contract renewal, whichever comes first. Track vendor incidents (breaches, outages) between evaluations and adjust scores accordingly.
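
A sketch covering both monitoring tasks: flagging overdue re-evaluations and applying an incident penalty. The 365-day window matches the annual cadence above; the 5-points-per-incident penalty is a made-up example that each organization should calibrate to its own risk tolerance.

    from datetime import date, timedelta

    REEVALUATION_INTERVAL = timedelta(days=365)  # annual cadence from the rubric
    INCIDENT_PENALTY = 5                         # illustrative points per incident

    def needs_reevaluation(last_evaluated, today=None):
        """True if the last evaluation is more than a year old."""
        today = today or date.today()
        return today - last_evaluated > REEVALUATION_INTERVAL

    def adjusted_score(base_score, incident_count):
        """Deduct an illustrative penalty per tracked incident, floored at 0."""
        return max(0, base_score - INCIDENT_PENALTY * incident_count)

    print(needs_reevaluation(date(2024, 1, 15), today=date(2025, 6, 1)))  # True
    print(adjusted_score(82, incident_count=2))  # 72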

Evidence & Audit

  • Vendor evaluation rubric document with scoring methodology
  • Completed scorecards for all active AI vendors
  • Vendor security questionnaires and attestations
  • Approval records tied to evaluation scores
  • Re-evaluation schedule and completion records

Related Controls