The System Security Plan (SSP) is the primary documentation artifact for NIST Risk Management Framework (RMF) authorization. It describes every security control applicable to a system, how each control is implemented, and the evidence that the implementation exists and works.
The traditional SSP is a Word document or Excel spreadsheet. It's manually maintained. It drifts from the actual system state within weeks of being written. By the time an assessment begins, program teams spend days or weeks reconciling what the SSP says with what the system actually does.
This is a documentation problem with an engineering solution: treat the SSP as a code artifact that stays synchronized with the infrastructure it describes.
Why SSPs Get Stale
The core problem is that SSPs are written separately from the system they document. A security engineer writes the SSP based on their understanding of the architecture. An infrastructure engineer deploys the actual system. The two artifacts are manually synchronized — which means they're synchronized exactly once, at SSP submission, and then gradually diverge.
Change events that create SSP drift:
- Infrastructure changes deployed through IaC that aren't reflected in the SSP
- Configuration changes applied through the console rather than through managed code
- New services added to the architecture without updating the boundary diagram
- Control implementations changed without updating the SSP narrative
For CMMC assessments, this drift means the package that assessors review doesn't match the environment they'll assess. The resulting gap findings require remediation on a compressed timeline — pressure that produces worse security outcomes, not better ones.
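Drift of this kind can be detected mechanically: fingerprint the resource set in the Terraform state at SSP generation time, and flag the SSP as stale whenever the live state no longer matches. A minimal sketch, where the fingerprint scheme and the `evidence_fingerprint` metadata field are illustrative assumptions rather than part of any standard:

```python
import hashlib
import json

def evidence_fingerprint(tf_state: dict) -> str:
    """Stable hash of the resource set that feeds SSP evidence."""
    resources = sorted(
        (r["type"], r["name"]) for r in tf_state.get("resources", [])
    )
    return hashlib.sha256(json.dumps(resources).encode()).hexdigest()

def ssp_is_stale(tf_state: dict, ssp_metadata: dict) -> bool:
    """The SSP is stale when the live infrastructure fingerprint no longer
    matches the one recorded at the last generation run."""
    return ssp_metadata.get("evidence_fingerprint") != evidence_fingerprint(tf_state)
```

A check like this can run on a schedule even when no pipeline has fired, catching out-of-band changes that the CI trigger would otherwise miss.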
The Code-First SSP Architecture
The solution is inverting the relationship: instead of the SSP driving the infrastructure design, the infrastructure code drives the SSP documentation.
Infrastructure as Code (Terraform, CloudFormation) is the source of truth for what controls are implemented and how. The SSP generation pipeline reads the actual infrastructure state and produces documentation that reflects reality.
```python
# SSP generation pipeline: reads infrastructure state, generates control narratives
import json
from dataclasses import dataclass
from typing import Dict, List

import boto3
from jinja2 import Template


@dataclass
class ControlImplementation:
    control_id: str
    status: str  # "implemented", "partially_implemented", "planned", "not_applicable"
    implementation_narrative: str
    responsible_roles: List[str]
    evidence_locations: List[str]


class SSPGenerator:
    def __init__(self, terraform_state_bucket: str, control_templates_path: str):
        self.s3 = boto3.client('s3', region_name='us-gov-west-1')
        self.tfstate_bucket = terraform_state_bucket
        self.templates_path = control_templates_path

    def get_infrastructure_state(self) -> dict:
        """Read the actual deployed infrastructure state from Terraform."""
        response = self.s3.get_object(
            Bucket=self.tfstate_bucket,
            Key='terraform.tfstate'
        )
        return json.loads(response['Body'].read())

    def extract_control_evidence(self, tf_state: dict) -> Dict[str, List[dict]]:
        """
        Extract evidence of control implementations from infrastructure state.
        Maps Terraform resources to NIST 800-171 control families.
        """
        evidence: Dict[str, List[dict]] = {}
        for resource in tf_state.get('resources', []):
            attributes = resource['instances'][0]['attributes']
            # AC controls: IAM resources, VPC security groups
            if resource['type'] in ('aws_iam_policy', 'aws_iam_role'):
                evidence.setdefault('AC', []).append({
                    'resource': resource['name'],
                    'type': resource['type'],
                    'attributes': attributes
                })
            # AU controls: CloudTrail, CloudWatch Logs
            if resource['type'] == 'aws_cloudtrail':
                evidence.setdefault('AU', []).append({
                    'resource': resource['name'],
                    'is_multi_region': attributes.get('is_multi_region_trail'),
                    'log_file_validation': attributes.get('enable_log_file_validation')
                })
            # SC controls: KMS, VPC endpoints, TLS configuration
            if resource['type'] == 'aws_kms_key':
                evidence.setdefault('SC', []).append({
                    'resource': resource['name'],
                    'key_rotation': attributes.get('enable_key_rotation')
                })
        return evidence

    def generate_control_narrative(self, control_id: str, evidence: dict) -> ControlImplementation:
        """Generate a control narrative from a template and actual evidence."""
        template_path = f"{self.templates_path}/{control_id}.j2"
        with open(template_path) as f:
            template = Template(f.read())
        narrative = template.render(evidence=evidence, control_id=control_id)
        return ControlImplementation(
            control_id=control_id,
            status="implemented" if evidence else "planned",
            implementation_narrative=narrative,
            responsible_roles=self._get_responsible_roles(control_id),
            evidence_locations=self._get_evidence_locations(control_id, evidence)
        )

    # _get_responsible_roles() and _get_evidence_locations() look up role
    # assignments and evidence pointers in a control registry (omitted here)
```

Jinja2 Templates for Control Narratives
Control narrative templates use the infrastructure evidence to produce accurate, specific documentation rather than generic boilerplate:
```jinja
{# Template for NIST 800-171 3.1.1 — Limit system access to authorized users #}
## 3.1.1 — Limit System Access to Authorized Users

**Status:** Implemented

### Implementation Narrative

Access to the covered contractor information system is controlled through
AWS Identity and Access Management (IAM).

The following IAM resources enforce access control:

{% for role in evidence.AC %}
- **{{ role.resource }}**: Role with permissions scoped to least privilege:
  - Attached policies: {{ role.attributes.get('managed_policy_arns', []) | join(', ') }}
  - Trust relationship restricted to: {{ role.attributes.assume_role_policy }}
{% endfor %}

Multi-factor authentication is enforced for console access through IAM Identity
Center with the following configuration:

- MFA required for all console sign-ins
- Session duration limited to {{ evidence.session_duration_hours }} hours

Human access is federated through {{ evidence.identity_provider }} — no direct
IAM user credentials are issued to individuals.

### Evidence Locations

{% for location in evidence_locations %}
- {{ location }}
{% endfor %}

### Responsible Roles

System Owner, System Administrator
```

This template produces a control narrative that reflects the actual IAM configuration deployed in Terraform: if the session duration changes in Terraform, the next SSP generation run reflects the new value.
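To see the evidence-to-narrative flow end to end, the fragment below renders a cut-down inline stand-in for such a template against a hand-built evidence dict. The template string and sample data are illustrative, not the full 3.1.1 template:

```python
from jinja2 import Template

# Cut-down, inline stand-in for the 3.1.1 narrative template
narrative_tmpl = Template(
    "Access control is enforced by {{ evidence['AC'] | length }} IAM resources: "
    "{{ evidence['AC'] | map(attribute='resource') | join(', ') }}."
)

# Sample evidence in the shape produced by extract_control_evidence()
evidence = {"AC": [{"resource": "app_role"}, {"resource": "ops_role"}]}

print(narrative_tmpl.render(evidence=evidence))
# prints: Access control is enforced by 2 IAM resources: app_role, ops_role.
```

Because the narrative is derived from the evidence dict rather than typed by hand, a role added in Terraform shows up in the narrative on the next generation run with no editorial step.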
CI/CD Integration: SSP Regeneration on Infrastructure Change
The SSP regeneration pipeline runs as part of the infrastructure CI/CD pipeline — every infrastructure change triggers an SSP update:
```yaml
# GitLab CI stage for SSP regeneration
generate-ssp:
  stage: documentation
  image: python:3.11-slim
  script:
    - pip install boto3 jinja2 python-docx
    - python scripts/generate_ssp.py
        --state-bucket $TF_STATE_BUCKET
        --templates-path ./ssp-templates
        --output-path ./artifacts/ssp
    - python scripts/generate_ssp_docx.py  # generate Word format for AO submission
  artifacts:
    paths:
      - artifacts/ssp/
    expire_in: 90 days
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
      changes:
        - terraform/**/*
        - infrastructure/**/*
```

The pipeline artifact is an SSP document that is timestamped, version-controlled (as a GitLab pipeline artifact), and synchronized with the infrastructure state at that moment. The SSP and the infrastructure diverge only if someone makes a change outside the pipeline, which the ITAR and access controls are designed to prevent.
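When deciding outside GitLab whether a given commit should trigger regeneration (for example, in a pre-commit hook), the `changes:` rule above can be approximated in a few lines. The glob list mirrors the pipeline config; `fnmatch` is a loose stand-in for GitLab's glob matching:

```python
from fnmatch import fnmatch

# Mirrors the pipeline's `changes:` rule; fnmatch's `*` also crosses `/`,
# so "terraform/*" behaves like "terraform/**/*" here
WATCHED_GLOBS = ("terraform/*", "infrastructure/*")

def needs_ssp_regen(changed_paths):
    """True when any changed file falls under a watched infrastructure path."""
    return any(
        fnmatch(path, pattern)
        for path in changed_paths
        for pattern in WATCHED_GLOBS
    )
```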
OSCAL: The Machine-Readable SSP Standard
NIST's Open Security Controls Assessment Language (OSCAL) is the emerging standard for machine-readable SSPs and control catalogs. OSCAL-formatted SSPs can be consumed by tools, shared between systems, and validated automatically.
The DoD and FedRAMP are both moving toward OSCAL-based packages. Building SSP automation on OSCAL JSON/YAML from the start positions a program for the tools ecosystem that's developing around it:
```json
{
  "system-security-plan": {
    "uuid": "a1b2c3d4-...",
    "metadata": {
      "title": "System Security Plan — [System Name]",
      "last-modified": "2026-03-21T00:00:00Z",
      "version": "1.4"
    },
    "control-implementation": {
      "implemented-requirements": [
        {
          "uuid": "...",
          "control-id": "ac-1",
          "description": "Access control policy implemented through AWS IAM...",
          "by-components": [
            {
              "component-uuid": "...",
              "description": "IAM roles scoped to least privilege per control requirement"
            }
          ]
        }
      ]
    }
  }
}
```

What This Delivers for CMMC and ATO
For CMMC Level 2 assessors, an SSP that accurately reflects the system is a dramatically better starting point than a stale document that requires reconciliation. The assessment can focus on verifying implementation quality rather than correcting documentation errors.
For continuous ATO, an SSP that updates with every infrastructure change is a prerequisite. The AO needs visibility into the current control implementation state — not the state it was in at the last annual review. Code-driven SSP generation provides that visibility.
For more on the broader continuous authorization architecture, see our guides on continuous ATO automation and CMMC Level 2 cloud architecture.
Discuss SSP automation and compliance documentation → rutagon.com/government
Frequently Asked Questions
What is a System Security Plan (SSP)?
An SSP is the primary documentation artifact for NIST Risk Management Framework authorization. It describes the system boundary, all applicable security controls, how each control is implemented, and the responsible parties. Every system seeking an ATO must have an approved SSP.
Is an SSP required for CMMC Level 2?
Yes. CMMC Level 2 requires a complete SSP addressing all 110 NIST 800-171 requirements. The SSP is reviewed by the C3PAO assessor as part of the assessment process. An incomplete or inaccurate SSP is a finding that must be corrected before authorization.
What is OSCAL and why does it matter for SSPs?
OSCAL (Open Security Controls Assessment Language) is NIST's XML/JSON schema for machine-readable security documentation — control catalogs, profiles, SSPs, and assessment results. Both FedRAMP and DoD are moving toward OSCAL-formatted packages. Building compliance tooling on OSCAL enables automation and toolchain interoperability that Word documents can't support.
How do we keep the SSP accurate as the system changes?
The only reliable approach is automated SSP generation from the infrastructure state — not manual maintenance. When infrastructure changes are deployed through code (IaC), the SSP generation pipeline runs automatically and produces updated documentation. Manual changes to the infrastructure outside the pipeline break this synchronization and must be governed through access controls.
What's the POAM and how does it relate to the SSP?
The Plan of Action and Milestones (POA&M) documents controls that are not fully implemented, with planned remediation dates. The POA&M is the companion document to the SSP — the SSP documents what's implemented, the POA&M documents what's not. Both are required for ATO. Automated tracking systems that compare infrastructure state against the control baseline can generate POA&M entries automatically for controls with open findings.
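As a sketch of that POA&M automation, assuming a simple list-of-dicts output format (the control IDs and the 90-day default are illustrative; a real baseline would cover all 110 requirements):

```python
from datetime import date, timedelta

# Illustrative subset of the NIST 800-171 baseline, not the full 110
BASELINE = ("3.1.1", "3.3.1", "3.13.11")

def generate_poam(implemented_controls, baseline=BASELINE, days_to_remediate=90):
    """Emit one open POA&M entry per baseline control lacking implementation evidence."""
    due = (date.today() + timedelta(days=days_to_remediate)).isoformat()
    return [
        {"control_id": c, "status": "open", "scheduled_completion": due}
        for c in baseline
        if c not in implemented_controls
    ]
```

Feeding this the set of control IDs the SSP generator marked "implemented" keeps the two documents consistent by construction: every control is either in the SSP or on the POA&M.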
Discuss your project with Rutagon
Contact Us →