Vendor Review Evidence Checklist
Map Meta, Microsoft SSPA, and Google Workspace security review requirements to SOC 2, ISO 27001, NIST CSF, and CIS Controls with this evidence checklist.
What This Checklist Covers
A cross-reference of security evidence requirements for Meta vendor reviews, Microsoft Supplier Security and Privacy Assurance (SSPA), and Google Workspace security assessments—mapped to SOC 2 Trust Services Criteria, ISO 27001 controls, NIST Cybersecurity Framework functions, and CIS Critical Security Controls.
Use this checklist to identify which existing compliance artifacts satisfy platform review requirements and where gaps exist. Platform reviewers accept evidence that demonstrates control effectiveness, not generic policy statements.
Why Platform Reviews Fail
Platform security reviews fail when organizations submit:
- Policy documents without implementation evidence
- Screenshots without context or validation dates
- Generic compliance statements instead of specific control mappings
- Outdated evidence that doesn’t reflect current state
Reviewers need proof that controls operate as designed: access logs, configuration exports, audit reports, penetration test results, and incident response records with timestamps and ownership.
Evidence Mapping Table
| Platform Requirement | SOC 2 Criteria | ISO 27001 Control | NIST CSF Function | CIS Control | Evidence Type |
|---|---|---|---|---|---|
| **Access Control & Identity Management** | | | | | |
| Multi-factor authentication (MFA) enforcement | CC6.1 | A.9.4.2 | PR.AC-7 | 6.3, 6.4 | Configuration export showing MFA enabled for all users; access logs with authentication method |
| Privileged access management | CC6.2, CC6.3 | A.9.2.3 | PR.AC-4 | 5.4, 6.8 | Role assignment matrix; privileged access review logs; session recordings |
| Access review cadence | CC6.1 | A.9.2.5 | PR.AC-4 | 5.4, 6.2 | Quarterly access review reports with approvals; deprovisioning logs |
| **Network & Infrastructure Security** | | | | | |
| Network segmentation | CC6.6 | A.13.1.3 | PR.AC-5 | 12.2, 12.3 | Network diagram with VLAN/subnet boundaries; firewall rule exports |
| Intrusion detection/prevention | CC7.2 | A.12.6.1, A.16.1.4 | DE.CM-1 | 13.1, 13.6 | IDS/IPS configuration; alert logs; incident escalation records |
| Vulnerability scanning | CC7.1 | A.12.6.1 | DE.CM-8 | 7.1, 7.5 | Scan reports (internal/external); remediation tracking; retest validation |
| **Data Protection** | | | | | |
| Encryption at rest | CC6.1 | A.10.1.1 | PR.DS-1 | 3.11 | Encryption configuration; key management procedures; storage audit logs |
| Encryption in transit | CC6.1, CC6.7 | A.10.1.1, A.13.1.1 | PR.DS-2 | 3.10 | TLS certificate chain; protocol configuration (TLS 1.2+); cipher suite list |
| Data retention & disposal | CC6.5 | A.8.3.2, A.11.2.7 | PR.IP-6 | 3.4, 3.5 | Retention policy with schedules; disposal logs; sanitization certificates |
| **Incident Response & Business Continuity** | | | | | |
| Incident response plan | CC7.3, CC7.4 | A.16.1.1, A.16.1.5 | RS.RP-1 | 17.1 | IR plan with roles/contacts; tabletop exercise after-action reports; incident tickets |
| Backup & recovery procedures | CC9.1 | A.12.3.1, A.17.1.2 | PR.IP-4 | 11.1, 11.3 | Backup configuration; restore test logs; RTO/RPO documentation |
| Business continuity testing | CC9.1 | A.17.1.3 | RC.RP-1 | 11.5 | BCDR test results; failover validation; communication logs |
| **Vendor & Third-Party Risk** | | | | | |
| Vendor risk assessments | CC9.2 | A.15.1.1, A.15.1.2 | ID.SC-2 | 15.1, 15.2 | Vendor security questionnaires; risk ratings; contract security clauses |
| Subprocessor due diligence | CC9.2 | A.15.2.1 | ID.SC-3 | 15.1 | Subprocessor list; security reviews; data processing agreements |
| **Compliance & Audit** | | | | | |
| SOC 2 Type II report | CC1.1–CC9.2 | Multiple | Multiple | Multiple | Current SOC 2 report (within 12 months); management response to findings |
| Penetration test results | CC7.1 | A.12.6.1 | DE.CM-8 | 18.2 | Pen test report (within 12 months); remediation plan; retest validation letter |
| Security awareness training | CC1.4 | A.7.2.2 | PR.AT-1 | 14.1, 14.2 | Training completion records; phishing simulation results; policy acknowledgments |
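Several rows above call for configuration or protocol evidence rather than policy text, and that evidence is stronger when captured programmatically with a timestamp. As a minimal sketch (Python standard library only; the hostname is a placeholder), the script below records the negotiated TLS version and cipher suite for one endpoint, which can accompany the encryption-in-transit row.

```python
# Minimal sketch: capture the negotiated TLS version and cipher suite for a
# host as timestamped evidence. The hostname is a placeholder; substitute your
# own endpoints.
import json
import socket
import ssl
from datetime import datetime, timezone

def tls_evidence(host: str, port: int = 443) -> dict:
    """Connect to host:port and record the negotiated protocol and cipher."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            protocol = tls.version()           # e.g. "TLSv1.3"
            cipher, _, bits = tls.cipher()     # (cipher name, protocol, secret bits)
    return {
        "host": host,
        "port": port,
        "protocol": protocol,
        "cipher": cipher,
        "key_bits": bits,
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    print(json.dumps(tls_evidence("example.com"), indent=2))
```

Saving the JSON output alongside the submission gives reviewers the capture date and the exact parameters observed, which a screenshot cannot prove.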
How to Use This Checklist
Step 1: Identify Required Evidence
Review the platform’s security questionnaire or assessment framework. Cross-reference each requirement to the table above to determine which evidence artifacts apply.
Step 2: Gather Existing Artifacts
Pull evidence from your compliance program: SOC 2 audit workpapers, ISO 27001 documentation, NIST CSF assessments, or CIS Controls implementation guides. Most platform requirements overlap with framework controls you’ve already implemented.
Step 3: Fill Gaps with New Evidence
Where evidence doesn’t exist, generate it:
- Export configurations from security tools (MFA, firewalls, encryption); see the export sketch after this list
- Run access reviews and document approvals
- Conduct vulnerability scans or penetration tests
- Create network diagrams with current architecture
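As one illustration of a configuration export, the sketch below assumes Okta as the identity provider and pulls MFA enrollment policies into a timestamped JSON evidence file. The domain, token, and endpoint path are assumptions; verify them against Okta's current API documentation before relying on the output.

```python
# Minimal sketch, assuming Okta as the identity provider. The domain and token
# are placeholders read from the environment, and the /api/v1/policies
# endpoint should be confirmed against Okta's current API docs.
import json
import os
from datetime import datetime, timezone

import requests

OKTA_DOMAIN = os.environ["OKTA_DOMAIN"]     # e.g. "example.okta.com"
OKTA_TOKEN = os.environ["OKTA_API_TOKEN"]   # read-only API token

def export_mfa_policies() -> None:
    """Pull MFA enrollment policies and save them as a timestamped evidence file."""
    resp = requests.get(
        f"https://{OKTA_DOMAIN}/api/v1/policies",
        params={"type": "MFA_ENROLL"},
        headers={"Authorization": f"SSWS {OKTA_TOKEN}", "Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    evidence = {
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "source": f"https://{OKTA_DOMAIN}/api/v1/policies?type=MFA_ENROLL",
        "policies": resp.json(),
    }
    filename = f"mfa-policy-export-{datetime.now(timezone.utc):%Y%m%d}.json"
    with open(filename, "w") as f:
        json.dump(evidence, f, indent=2)
    print(f"Wrote {filename}")

if __name__ == "__main__":
    export_mfa_policies()
```

The same pattern works for firewall rule exports or encryption settings: pull the live configuration through the tool's API, wrap it with a capture timestamp and source URL, and file it under the matching platform requirement.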
Step 4: Prepare Submission Package
Organize evidence by platform requirement with the following fields (a manifest sketch follows this list):
- Control description: What the control does
- Implementation details: How it’s configured
- Evidence artifact: Screenshot, log export, report, or configuration file
- Validation date: When evidence was captured
- Owner: Who maintains the control
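A minimal sketch of those five fields as a machine-readable manifest; the requirement names, file paths, and owners shown are placeholders.

```python
# Minimal sketch of a submission manifest capturing the five fields above.
# Requirement names, file paths, dates, and owners are placeholders.
import json
from datetime import date

manifest = [
    {
        "platform_requirement": "Multi-factor authentication enforcement",
        "control_description": "MFA is required for all workforce accounts at sign-in.",
        "implementation_details": "Enforced via identity provider sign-on policy; no exclusions.",
        "evidence_artifact": "evidence/mfa-policy-export-20250101.json",
        "validation_date": str(date(2025, 1, 1)),
        "owner": "IT Security Lead",
    },
]

with open("submission-manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
```

A manifest like this also makes clarification responses faster: each answer can point to a specific artifact path and validation date instead of a folder of screenshots.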
Step 5: Write Clarifications
Platform reviewers often request clarifications. Prepare concise responses that:
- Reference specific evidence artifacts by filename
- Cite framework control IDs (e.g., “This satisfies SOC 2 CC6.1 and ISO 27001 A.9.4.2”)
- Explain any deviations with compensating controls
- Provide remediation timelines for gaps
Common Evidence Gaps
Organizations frequently lack:
- Access review logs with approvals: Reviewers need proof that access was reviewed and approved, not just a list of current users
- Configuration exports with timestamps: Screenshots can be altered and carry no provenance; exports pulled directly from the system include capture timestamps and can be regenerated for verification
- Penetration test retests: Initial pen test findings must show remediation and retest validation
- Incident response evidence: Tabletop exercises or real incident tickets demonstrate the IR plan works
- Backup restore tests: Backup configurations aren’t enough; reviewers want proof you’ve successfully restored data (see the sketch after this list)
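A minimal sketch of documenting a restore test, assuming you can compare restored files against a known-good copy; the directory paths are placeholders.

```python
# Minimal sketch: document a backup restore test by comparing SHA-256 hashes of
# restored files against the originals. Directory paths are placeholders.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def restore_test_report(original_dir: str, restored_dir: str) -> dict:
    """Compare every file in original_dir against its counterpart in restored_dir."""
    results = []
    for original in Path(original_dir).rglob("*"):
        if not original.is_file():
            continue
        restored = Path(restored_dir) / original.relative_to(original_dir)
        results.append({
            "file": str(original.relative_to(original_dir)),
            "match": restored.is_file() and sha256(original) == sha256(restored),
        })
    return {
        "tested_at": datetime.now(timezone.utc).isoformat(),
        "files_checked": len(results),
        "all_match": all(r["match"] for r in results),
        "results": results,
    }

if __name__ == "__main__":
    print(json.dumps(restore_test_report("data/original", "data/restored"), indent=2))
```

Keep the report output with the backup configuration export so the evidence shows both that backups run and that they can be restored.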
Bridge Letters for Automation Tools
If you use compliance automation platforms (Vanta, Secureframe, Drata, Scrut), platform reviewers may not accept tool-generated reports directly. Write a bridge letter that:
- Explains what the automation tool checks
- Maps tool checks to specific platform requirements
- Identifies what the tool cannot validate (e.g., policy effectiveness, incident response execution)
- Describes manual validation procedures for gaps
Example: “Our Vanta continuous monitoring validates MFA enforcement by checking Okta configuration daily. However, Vanta does not validate privileged access reviews. We conduct quarterly manual reviews documented in [evidence artifact].”
Next Steps
Need help preparing evidence? Bell Tower builds control crosswalks, evidence libraries, and submission packages for Meta, Microsoft SSPA, and Google Workspace reviews.
- Vendor & Platform Security Review Support
- SOC 2 Readiness Assessment
- ISO 27001 Risk Assessment & Certification Support
Frequently Asked Questions
How recent does evidence need to be?
Most platform reviewers require evidence from the past 12 months. Penetration tests, SOC 2 reports, and vulnerability scans older than 12 months will be rejected. Configuration exports and access reviews should reflect current state (within 30-90 days).
Can I reuse SOC 2 evidence for platform reviews?
Yes. SOC 2 Type II reports and audit workpapers contain most evidence platform reviewers need. However, you’ll need to map SOC 2 controls to platform-specific requirements and provide additional artifacts for gaps.
What if my organization doesn’t have SOC 2 or ISO 27001 certification?
You can still pass platform reviews by generating evidence directly. Implement the controls listed in this checklist, document configurations, and produce the required artifacts. Certification is not mandatory, but it streamlines the process.
Do I need to submit all evidence upfront?
No. Most platforms use a phased review process: initial questionnaire, evidence submission, clarifications, and final approval. Submit core evidence (network diagrams, SOC 2 reports, pen test results) first, then respond to specific requests during clarifications.
How long does a platform review take?
Initial reviews typically take 4-8 weeks depending on evidence completeness. Renewals or reassessments (e.g., Microsoft SSPA annual reviews) take 2-4 weeks if evidence is current and well-organized.