
Updated March 2026 · 11 min read

# Software Supply Chain Security for Government Systems

Software supply chain security isn't theoretical anymore. Executive Order 14028 made it federal policy. The SolarWinds and Log4Shell incidents proved that compromising a single dependency can cascade through thousands of organizations. For government systems, where the stakes include national security, supply chain integrity is as critical as the application code itself.

We build CI/CD pipelines that treat every dependency, container image, and build artifact as a potential attack vector — because they are. This isn't paranoia; it's engineering discipline.

## The Attack Surface

Modern software systems are assembled, not written. A typical web application pulls in hundreds of direct dependencies and thousands of transitive dependencies. Each one is a trust decision. Each one could be compromised through:

  • Typosquatting: Malicious packages with names similar to popular libraries
  • Account compromise: Attacker gains control of a maintainer's publishing credentials
  • Build system compromise: Malicious code injected during the build process, not visible in source
  • Dependency confusion: Private package names shadowed by public registry packages

The attack doesn't need to target your code. It targets a dependency three levels deep that your developers never consciously chose to include.
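Some of these vectors can be caught mechanically. As an illustration, a minimal typosquat check compares declared dependency names against a list of popular packages by edit distance (the package list and single-edit threshold here are assumptions for the sketch, not a production heuristic — real checks use registry-wide popularity data):

```python
# Sketch: flag dependency names within edit distance 1 of a popular package.
# POPULAR is a stand-in list for illustration only.

POPULAR = {"requests", "urllib3", "numpy", "lodash", "express"}

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def suspected_typosquats(dependencies):
    """Return (dependency, popular_package) pairs one edit apart."""
    hits = []
    for dep in dependencies:
        for pkg in POPULAR:
            if dep != pkg and edit_distance(dep, pkg) == 1:
                hits.append((dep, pkg))
    return hits
```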

## Software Bill of Materials (SBOM)

An SBOM is a complete inventory of every component in your software. Think of it as a nutritional label for software — it tells you exactly what's inside.

For government systems, SBOM generation is increasingly mandated. NTIA minimum elements require:

  • Supplier name
  • Component name and version
  • Unique identifier (CPE, PURL, or SWID)
  • Dependency relationship
  • Author of SBOM data
  • Timestamp
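A pipeline gate can reject SBOMs that omit these elements. A minimal sketch against a CycloneDX-style JSON document (the exact field names are assumptions about the layout; adapt them to your SBOM format):

```python
# Sketch: verify each SBOM component carries the NTIA minimum elements.
# Field names assume a CycloneDX-like JSON layout (an assumption).

REQUIRED_COMPONENT_FIELDS = ("supplier", "name", "version", "purl")

def missing_ntia_fields(sbom: dict) -> dict:
    """Map component name -> list of missing required fields."""
    problems = {}
    for component in sbom.get("components", []):
        missing = [f for f in REQUIRED_COMPONENT_FIELDS if not component.get(f)]
        if missing:
            problems[component.get("name", "<unnamed>")] = missing
    # Document-level elements (author, timestamp) live in metadata here
    meta = sbom.get("metadata", {})
    doc_missing = [f for f in ("authors", "timestamp") if not meta.get(f)]
    if doc_missing:
        problems["<document>"] = doc_missing
    return problems
```

An empty result means the SBOM passes the gate; anything else fails the build with a list of what to fix.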

### Generating SBOMs in CI/CD

Integrate SBOM generation into every build pipeline. For container images, Syft generates comprehensive SBOMs:

```yaml
# GitHub Actions: SBOM generation and attestation
name: Build with SBOM
on:
  push:
    branches: [main]

env:
  REGISTRY: ghcr.io/your-org   # set to your container registry
  IMAGE: your-app              # set to your image name

jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
      id-token: write

    steps:
      - uses: actions/checkout@v4

      - name: Build container image
        run: |
          docker build -t ${{ env.REGISTRY }}/${{ env.IMAGE }}:${{ github.sha }} .

      # Registry login step omitted for brevity
      - name: Push image
        run: docker push ${{ env.REGISTRY }}/${{ env.IMAGE }}:${{ github.sha }}

      - name: Generate SBOM
        uses: anchore/sbom-action@v0
        with:
          image: ${{ env.REGISTRY }}/${{ env.IMAGE }}:${{ github.sha }}
          format: spdx-json
          output-file: sbom.spdx.json

      - name: Scan SBOM for vulnerabilities
        uses: anchore/scan-action@v4
        with:
          sbom: sbom.spdx.json
          fail-build: true
          severity-cutoff: high

      - name: Install Cosign
        uses: sigstore/cosign-installer@v3

      - name: Attach SBOM to image
        run: |
          cosign attach sbom --sbom sbom.spdx.json \
            ${{ env.REGISTRY }}/${{ env.IMAGE }}:${{ github.sha }}
```

For application dependencies, generate SBOMs at the language level too:

```bash
# Node.js SBOM
npx @cyclonedx/cyclonedx-npm --output-file node-sbom.json

# Python SBOM
cyclonedx-py requirements -i requirements.txt -o python-sbom.json

# Go SBOM
cyclonedx-gomod app -json -output go-sbom.json
```

### SBOM Storage and Querying

Store SBOMs alongside the artifacts they describe. When a new CVE drops, you need to answer "which of our deployed systems use this vulnerable component?" within minutes, not days.

```python
from datetime import datetime, timezone


class SBOMRegistry:
    """Indexes SBOM components so CVE impact queries take minutes, not days."""

    def __init__(self, storage_backend):
        self.storage = storage_backend

    def ingest(self, artifact_id, sbom):
        """Index every component of an artifact's SBOM at build time."""
        components = self._parse_components(sbom)
        for component in components:
            self.storage.index(
                artifact_id=artifact_id,
                component_name=component.name,
                component_version=component.version,
                component_purl=component.purl,
                ingested_at=datetime.now(timezone.utc),
            )

    def query_by_vulnerability(self, cve_id):
        """Return every artifact containing a component affected by the CVE."""
        affected_components = self._lookup_cve_affected(cve_id)
        affected_artifacts = []
        for component in affected_components:
            artifacts = self.storage.find_artifacts_with(
                component.name, component.affected_versions
            )
            affected_artifacts.extend(artifacts)
        return affected_artifacts
```

## Dependency Scanning

SBOM generation tells you what you have. Dependency scanning tells you if what you have is vulnerable.
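Conceptually, the scanning step is a join between the SBOM's component list and a vulnerability feed. A minimal sketch with an in-memory feed (the feed shape and entries are invented for illustration; real pipelines pull affected-version data from OSV, NVD, or a commercial source):

```python
# Sketch: match SBOM components against a vulnerability feed keyed by
# package name and affected versions. Feed contents are illustrative only.

VULN_FEED = {
    "libxml2": {"CVE-2025-12345": {"2.9.10", "2.9.11"}},
    "log4j-core": {"CVE-2021-44228": {"2.14.0", "2.14.1"}},
}

def vulnerable_components(sbom_components):
    """Return (name, version, cve) for every component present in the feed."""
    findings = []
    for comp in sbom_components:
        for cve, versions in VULN_FEED.get(comp["name"], {}).items():
            if comp["version"] in versions:
                findings.append((comp["name"], comp["version"], cve))
    return findings
```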

### Multi-Layer Scanning

Scan at every level of the stack:

  • Source dependencies: npm audit, pip-audit, govulncheck during development
  • Container base images: Trivy or Grype scanning against the base image
  • Full container image: Complete vulnerability scan of the assembled image
  • Runtime dependencies: Continuous scanning of deployed images against new CVE data

```yaml
# Trivy scanning configuration for government standards
trivy:
  severity: CRITICAL,HIGH,MEDIUM
  ignore-unfixed: false
  exit-code: 1
  security-checks:
    - vuln
    - config
    - secret
  vuln-type:
    - os
    - library
```

### Handling False Positives and Exceptions

Not every CVE applies to every usage. A vulnerability in a library's XML parser doesn't affect your application if you never parse XML with that library. But in government systems, you can't just suppress findings without justification.

Implement a formal exception process:

```json
{
  "exceptions": [
    {
      "cve": "CVE-2025-12345",
      "component": "libxml2",
      "justification": "Vulnerable function is not reachable from application code paths. Confirmed via static analysis.",
      "approved_by": "security-lead",
      "approved_date": "2026-02-15",
      "review_date": "2026-05-15",
      "status": "active"
    }
  ]
}
```

Every exception has an expiration date. Exceptions are reviewed, not granted permanently.
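For the exception file to mean anything, the scanner gate has to consult it. A minimal sketch that suppresses only active, unexpired exceptions (the finding shape is an assumption for illustration):

```python
from datetime import date

# Sketch: filter scanner findings through the exception file.
# An exception suppresses a finding only while active and unexpired.

def unsuppressed_findings(findings, exceptions, today=None):
    """Return findings not covered by an active, unexpired exception."""
    today = today or date.today()
    active = {
        (e["cve"], e["component"])
        for e in exceptions
        if e["status"] == "active"
        and date.fromisoformat(e["review_date"]) >= today
    }
    return [f for f in findings if (f["cve"], f["component"]) not in active]
```

Anything the function returns fails the build; an expired exception stops suppressing automatically, forcing the scheduled review.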

## Signed Artifacts with Cosign

Signing artifacts proves they haven't been tampered with since the build system produced them. Cosign (part of the Sigstore project) provides keyless signing using OIDC identity — your CI/CD pipeline's identity becomes the signing credential.

```bash
# Keyless signing with Cosign using OIDC identity
cosign sign --yes \
  --rekor-url https://rekor.sigstore.dev \
  $REGISTRY/$IMAGE:$SHA

# Verification at deployment time
cosign verify \
  --certificate-identity-regexp "https://github.com/org/repo" \
  --certificate-oidc-issuer https://token.actions.githubusercontent.com \
  $REGISTRY/$IMAGE:$SHA
```

This is the same container security approach we implement in production CI/CD pipelines. Every image that reaches production is signed. Every deployment verifies the signature. An unsigned or tampered image cannot deploy.

For air-gapped government environments where keyless signing isn't possible, use key-pair signing with keys stored in AWS KMS:

```bash
# Generate key pair in KMS
cosign generate-key-pair --kms awskms:///alias/cosign-signing-key

# Sign with KMS-backed key
cosign sign --key awskms:///alias/cosign-signing-key \
  $REGISTRY/$IMAGE:$SHA

# Verify with public key
cosign verify --key cosign.pub $REGISTRY/$IMAGE:$SHA
```

## SLSA Framework: Supply Chain Levels

SLSA (Supply-chain Levels for Software Artifacts) is a framework for progressively hardening your build process. It defines four levels:

| Level | Requirements |
|-------|-------------|
| SLSA 1 | Build process documented, provenance generated |
| SLSA 2 | Build on hosted platform, signed provenance |
| SLSA 3 | Hardened build platform, non-falsifiable provenance |
| SLSA 4 | Two-person review, hermetic builds, reproducible |

For government systems, SLSA 3 should be the target. This means:

  • Builds run on a managed platform (GitHub Actions, GitLab CI) — not developer laptops
  • Build provenance is generated automatically and signed
  • Build configuration is version-controlled and tamper-resistant
  • Source integrity is verified before every build

### Provenance Attestation

Provenance answers: who built this, from what source, using what process, and on what infrastructure? SLSA provenance is a signed attestation that records these facts.

```yaml
# SLSA provenance generation in GitHub Actions.
# The generator is a reusable workflow, so it is called as a job
# (with `uses:` at the job level), not as a step inside another job.
provenance:
  needs: build
  permissions:
    actions: read
    id-token: write
    packages: write
  uses: slsa-framework/slsa-github-generator/.github/workflows/generator_container_slsa3.yml@v2.0.0
  with:
    image: ${{ needs.build.outputs.image }}
    digest: ${{ needs.build.outputs.digest }}
```

The provenance attestation is stored alongside the image and can be verified at deployment time. If the provenance doesn't match expected values — wrong source repository, wrong build platform, unexpected build parameters — deployment is blocked.
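The deployment-time check reduces to comparing a few provenance fields against expected values. A simplified sketch over a decoded predicate (the field paths and expected builder identity are assumptions modeled loosely on the SLSA v1 layout; the attestation signature must be verified first, e.g. with `cosign verify-attestation`, before these fields can be trusted):

```python
# Sketch: policy check on decoded provenance fields. The predicate layout
# and EXPECTED values are simplified assumptions for illustration.

EXPECTED = {
    "builder_id": "https://github.com/slsa-framework/slsa-github-generator"
                  "/.github/workflows/generator_container_slsa3.yml@refs/tags/v2.0.0",
    "source_repo": "https://github.com/org/repo",
}

def provenance_violations(predicate: dict) -> list:
    """Return human-readable policy violations (empty list = pass)."""
    violations = []
    builder = predicate.get("runDetails", {}).get("builder", {}).get("id", "")
    if builder != EXPECTED["builder_id"]:
        violations.append(f"unexpected builder: {builder!r}")
    source = (predicate.get("buildDefinition", {})
                       .get("externalParameters", {})
                       .get("source", ""))
    if not source.startswith(EXPECTED["source_repo"]):
        violations.append(f"unexpected source: {source!r}")
    return violations
```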

## Policy Enforcement at Deployment

All the scanning, signing, and attestation is meaningless without enforcement. The deployment pipeline must reject artifacts that don't meet policy.

In Kubernetes environments, admission controllers enforce image policies:

```yaml
apiVersion: policy.sigstore.dev/v1beta1
kind: ClusterImagePolicy
metadata:
  name: require-signed-images
spec:
  images:
    - glob: "registry.internal/**"
  authorities:
    - keyless:
        url: https://fulcio.sigstore.dev
        identities:
          - issuer: https://token.actions.githubusercontent.com
            subjectRegExp: "https://github.com/org/.*"
    - attestations:
        - name: require-sbom
          predicateType: https://spdx.dev/Document
        - name: require-vuln-scan
          predicateType: https://cosign.sigstore.dev/attestation/vuln/v1
          policy:
            type: cue
            data: |
              predicateType: "https://cosign.sigstore.dev/attestation/vuln/v1"
              predicate: {
                scanner: {
                  result: "PASS"
                }
              }
```

This policy requires every image to be signed, have an SBOM attestation, and pass vulnerability scanning before it can run in the cluster. No exceptions, no overrides.

Our approach to security automation embeds these controls into every pipeline we build. Supply chain security isn't a separate workstream — it's how we build software.

The zero-trust principles that govern our infrastructure access extend to the software supply chain: trust nothing implicitly, verify everything explicitly, and maintain evidence of every verification for compliance.

## Responding to New Vulnerabilities

When a critical CVE drops, the response playbook is:

  1. Query SBOM registry for affected artifacts (minutes)
  2. Identify deployed instances of affected artifacts (minutes)
  3. Assess exploitability in context (hours)
  4. Patch, rebuild, re-sign, and redeploy (hours)
  5. Verify patched artifacts are deployed across all environments (automated)

The speed of step 1 determines the speed of everything else. Without SBOMs, step 1 takes days of manual investigation.
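The step-1 lookup is a pure index query when SBOM components are ingested at build time. A minimal in-memory sketch (real deployments back this with a database; the names here are illustrative):

```python
# Sketch: answer "which artifacts contain component X at version Y?"
# from an inverted index built when SBOMs are ingested.

def build_index(sboms):
    """Map (component name, version) -> set of artifact ids."""
    index = {}
    for artifact_id, components in sboms.items():
        for name, version in components:
            index.setdefault((name, version), set()).add(artifact_id)
    return index

def affected_artifacts(index, component, versions):
    """Artifacts containing any affected version of the component."""
    hits = set()
    for v in versions:
        hits |= index.get((component, v), set())
    return hits
```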

## Frequently Asked Questions

### What SBOM format should government systems use?

SPDX and CycloneDX are both acceptable. SPDX is an ISO standard (ISO/IEC 5962:2021) and is more commonly referenced in federal guidance. CycloneDX has richer vulnerability correlation capabilities. Many organizations generate both. For federal compliance, SPDX in JSON format is the safest choice.

### How do you handle supply chain security in air-gapped environments?

Mirror approved packages to an internal registry behind the air gap. All packages entering the environment pass through a review process: vulnerability scanning, license compliance checking, and SBOM validation. Container images are built inside the boundary using pre-staged base images and dependencies. Signing uses key-pair (not keyless) with keys managed in a hardware security module.

### Does SLSA replace existing security frameworks like FedRAMP?

No. SLSA specifically addresses the build and distribution pipeline. FedRAMP addresses the broader operational security of cloud services. They're complementary. SLSA provenance can serve as evidence for FedRAMP configuration management (CM) controls and system integrity (SI) controls, but it doesn't cover the full FedRAMP control baseline.

### What's the performance impact of supply chain security scanning in CI/CD?

SBOM generation adds 30-60 seconds to a container build. Vulnerability scanning adds 1-3 minutes depending on image size. Signing adds 10-15 seconds. For a typical 10-minute build pipeline, that totals roughly two to four minutes, or about 20-40% of build time. This is a trivial cost compared to the alternative — deploying a compromised dependency to production.

### How often should SBOMs be regenerated?

Generate a new SBOM with every build. SBOMs should be tied to specific artifact versions, not maintained as standalone documents. When you build version 2.4.1 of your application, the SBOM for 2.4.1 is generated during that build and attached to that artifact. Continuous vulnerability scanning runs against stored SBOMs to detect newly disclosed vulnerabilities in already-deployed components.

---

Software supply chain security is how you prove your systems are built from trusted components. Talk to Rutagon about implementing supply chain security for government and defense systems.
