
Performance-Based Contracting for Government IT

Updated April 2026 · 9 min read

Performance-based contracting is not a new concept in federal acquisition, but its application to IT and software programs has evolved significantly as agencies have moved toward agile delivery models and cloud-native architectures. Understanding how performance-based service acquisition (PBSA) works — and specifically what it requires from a technology delivery standpoint — helps small business IT companies demonstrate value in ways that matter to government evaluators and award fee panels.

What Performance-Based Contracting Is

Federal Acquisition Regulation (FAR) Subpart 37.6 establishes performance-based acquisition as the preferred method for acquiring services. The core principle is straightforward: the government specifies desired outcomes and performance standards rather than prescribing how the work must be done.

For IT programs, this means the government states:

  • What functions the system must perform
  • What reliability and performance targets the system must meet
  • What quality standards apply to delivered software
  • How performance will be measured

And leaves to the contractor:

  • How to architect the system
  • Which technologies and tools to use
  • How to staff and organize the delivery team
  • What methodology and processes to follow

This is how modern software development should work. A government IT program that specifies "you must use Java 8 and Oracle Database" rather than "the system must process 500 transactions per second with 99.9% availability" is prescribing implementation rather than outcomes — the opposite of PBSA intent.

Performance Work Statement (PWS) Structure

The Performance Work Statement is the central document in a PBSA contract. Unlike a traditional Statement of Work (SOW), which describes tasks to be performed, a PWS describes results to be achieved.

A well-constructed PWS for an IT program includes:

Background and scope: What mission the system supports, who the users are, and the operational context.

Required outcomes: What the system must do, expressed as measurable results. "Provide a web-based interface for authorized users to submit, track, and approve requests" rather than "develop a web application."

Performance standards: Specific, measurable thresholds for each required outcome. For an IT system:

  • Availability: 99.9% during core business hours
  • Response time: 95th percentile under 2 seconds for standard queries
  • Processing timeliness: 99% of submitted requests processed within 15 minutes
  • Defect rate: No more than 1 critical defect per release; no more than 5 high-severity defects per release
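
As a concrete illustration, verifying a month of delivery against standards like these reduces to a handful of threshold comparisons. The Python sketch below uses hypothetical metric values and assumes the contractor already collects availability, latency percentiles, processing timeliness, and defect counts from its monitoring stack:

    # Hypothetical monthly metrics pulled from a monitoring stack.
    metrics = {
        "availability_pct": 99.93,    # uptime during core business hours
        "p95_response_secs": 1.7,     # 95th percentile query latency
        "pct_processed_15min": 99.4,  # requests processed within 15 minutes
        "critical_defects": 0,        # critical defects in this release
        "high_defects": 3,            # high-severity defects in this release
    }

    # Thresholds taken from the example performance standards above.
    standards = {
        "availability_pct": ("min", 99.9),
        "p95_response_secs": ("max", 2.0),
        "pct_processed_15min": ("min", 99.0),
        "critical_defects": ("max", 1),
        "high_defects": ("max", 5),
    }

    for name, (kind, threshold) in standards.items():
        value = metrics[name]
        ok = value >= threshold if kind == "min" else value <= threshold
        print(f"{name}: {value} ({'PASS' if ok else 'FAIL'} vs {kind} {threshold})")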

Acceptable Quality Levels (AQLs): The minimum acceptable performance for each standard, and the monitoring frequency and method.

Quality Assurance Surveillance Plan (QASP): The government's plan for monitoring contractor performance against the standards. For IT programs, this typically includes automated monitoring dashboards, periodic government testing, and user satisfaction surveys.

Award Fee and Incentive Fee Structures

Many government IT contracts use award fee or incentive fee mechanisms to align contractor financial incentives with performance.

Award fee: A pool of fee (typically 5-15% of contract value) distributed based on a periodic evaluation of performance. An Award Fee Board evaluates contractor performance against defined criteria and recommends the percentage of the available fee earned for the period; the final determination is made by the Fee-Determining Official (FDO) based on the board's recommendation.

Incentive fee: A formula-based fee tied to specific performance metrics. If the contractor meets the target, they earn the fee. If they exceed the target, they may earn additional fee. If they miss the target, fee is reduced. Unlike award fee, incentive fee is formula-driven — there is no evaluator discretion.
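
A minimal sketch of what "formula-driven" means in practice: fee scales linearly between a floor and a ceiling around the target, with no evaluator judgment involved. The fee percentages and availability thresholds below are illustrative, not taken from any actual contract:

    def incentive_fee(measured: float, target: float, floor: float,
                      ceiling: float, target_fee: float,
                      min_fee: float, max_fee: float) -> float:
        """Linear fee adjustment around a performance target.

        Below the floor the contractor earns the minimum fee; above the
        ceiling, the maximum; in between, the fee interpolates linearly.
        """
        if measured <= floor:
            return min_fee
        if measured >= ceiling:
            return max_fee
        if measured < target:
            frac = (measured - floor) / (target - floor)
            return min_fee + frac * (target_fee - min_fee)
        frac = (measured - target) / (ceiling - target)
        return target_fee + frac * (max_fee - target_fee)

    # Illustrative: 99.7% measured availability against a 99.5% target
    # earns fee between the 7% target fee and the 10% maximum.
    print(incentive_fee(99.7, target=99.5, floor=99.0, ceiling=99.9,
                        target_fee=0.07, min_fee=0.02, max_fee=0.10))  # 0.085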

For IT programs, common award fee criteria include:

  • System availability above the AQL threshold
  • On-time delivery of committed features/releases
  • Defect density (defects per KLOC or per release; see the worked example after this list)
  • User satisfaction scores
  • Response quality and timeliness to government requests
  • Security posture (vulnerability remediation SLAs met, zero high findings open past SLA)
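
Defect density, one of the criteria above, is simply defect count normalized by code size. A quick sketch with illustrative numbers:

    # Defect density: defects normalized per thousand lines of code (KLOC).
    defects_found = 12
    lines_of_code = 48_000
    density = defects_found / (lines_of_code / 1000)
    print(f"{density:.2f} defects per KLOC")  # 0.25 defects per KLOC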

Understanding how award fee criteria are structured — and what they actually measure — tells a technology subcontractor exactly where to focus quality investment. A prime that consistently earns the maximum award fee on an IT program typically does so on the strength of its subcontractors' delivery.

Quality Assurance Surveillance Plan (QASP) Implications

The QASP defines how the government will observe, test, and verify that performance standards are being met. For technology subs, the QASP determines what operational data the government will examine:

Automated monitoring data: Uptime dashboards, latency metrics, error rates. The government COR (Contracting Officer's Representative) typically has read access to the monitoring dashboard and can pull data on demand. This is not a quarterly report — it is continuous visibility.
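
To make the availability arithmetic behind such a dashboard concrete, here is a minimal sketch assuming downtime is tracked as service-affecting incident minutes during core business hours (all values illustrative):

    # Core business hours: 12 hours/day across 22 business days in the month.
    core_minutes = 12 * 60 * 22        # 15,840 measured minutes
    incident_minutes = 11              # service-affecting downtime logged

    availability = 100 * (core_minutes - incident_minutes) / core_minutes
    print(f"Availability: {availability:.3f}%")  # 99.931% -> above a 99.9% AQL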

Periodic testing: Some QASPs require government-directed testing at specified intervals — penetration tests, load tests, accessibility scans. Technology subs should expect their systems to be tested by third parties on the government's behalf and build systems that pass these tests routinely.

User satisfaction surveys: Agencies often survey end users periodically. Low satisfaction scores affect award fee determinations even when technical metrics are meeting AQLs.

Incident reports: Every service-affecting incident that falls within the performance standard scope must be reported. For FISMA systems, incidents that affect availability are typically reportable within defined timeframes.

The QASP creates transparency obligations. Systems that look fine in static reports but degrade under real user load, accumulate technical debt silently, or require undisclosed maintenance windows will eventually surface in QASP monitoring. Build systems designed to be monitored — not just systems that work when nobody's looking.

How Technology Subs Deliver in a PBSA Environment

When a small business IT company is subcontracting under a PBSA prime contract, the prime's award fee determination is directly affected by sub performance. Primes that choose subs based on lowest cost only — without verifying delivery capability — risk burning award fee when subs miss AQLs.

What primes need from technology subs in PBSA environments:

Operational metrics you can provide: Can you give the prime access to availability dashboards that demonstrate AQL compliance? Can you generate automated reports of system performance metrics suitable for the QASP? Running a production system at scale — a content platform serving millions of monthly views, for example — provides demonstrated experience with these operational requirements.

Proactive incident reporting: PBSA contracts require prompt incident reporting. Technology subs who surface incidents proactively to the prime — before the government discovers them through monitoring — are far more valuable than those who minimize or delay disclosure.

Defect management discipline: PBSA award fee criteria typically include defect density and severity. Subs that manage a clear defect backlog with severity classification, track remediation timelines, and provide this data to the prime enable the prime to demonstrate QASP compliance.
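
A sketch of the minimum data discipline this implies: each defect carries a severity class and a remediation clock, so the backlog can be rolled up into QASP reporting on demand. The field names and SLA values here are assumptions for illustration, not a government standard:

    from dataclasses import dataclass
    from datetime import date

    # Assumed remediation SLAs in days, by severity class.
    REMEDIATION_SLA_DAYS = {"critical": 3, "high": 14, "moderate": 30, "low": 90}

    @dataclass
    class Defect:
        defect_id: str
        severity: str               # critical / high / moderate / low
        opened: date
        closed: date | None = None  # None while the defect remains open

        def days_open(self, as_of: date) -> int:
            return ((self.closed or as_of) - self.opened).days

        def past_sla(self, as_of: date) -> bool:
            if self.closed is not None:
                return False
            return self.days_open(as_of) > REMEDIATION_SLA_DAYS[self.severity]

    backlog = [
        Defect("D-101", "high", date(2026, 3, 2)),
        Defect("D-102", "critical", date(2026, 3, 28), closed=date(2026, 3, 30)),
    ]
    today = date(2026, 4, 1)
    print([d.defect_id for d in backlog if d.past_sla(today)])  # ['D-101']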

Deployment predictability: Award fee periods are typically 6 or 12 months. A technology sub that delivers reliably every sprint over a 6-month period enables the prime to make concrete commitments in award fee self-assessments. A sub with erratic delivery creates uncertainty that prime program managers cannot plan around.

These delivery characteristics align with what Rutagon brings to government programs — the track record covered in our delivery model for federal IT and the continuous monitoring architecture we implement as a standard deliverable.

Frequently Asked Questions

What is the difference between a PWS and a SOW for government IT contracts?

A Statement of Work (SOW) describes tasks and activities — what the contractor will do, step by step. A Performance Work Statement (PWS) describes required outcomes and performance standards — what the system must achieve. A SOW might say "the contractor will conduct weekly status meetings." A PWS might say "the contractor will maintain 99.9% system availability and report all incidents within 2 hours." FAR Subpart 37.6 establishes PBSA (including the PWS) as the preferred approach for service acquisitions, but many agencies still use SOWs, particularly for development contracts where the deliverables are concrete (software systems, documents, analyses).

How is award fee allocated across a subcontracting chain?

Award fee is typically earned by the prime contractor — it is part of the prime's contract with the government. How the prime shares that fee with subcontractors is a separate commercial arrangement between the prime and sub. Some primes incorporate performance-based fee arrangements into subcontracts that mirror the government's award fee criteria. Others simply use the award fee as a margin management mechanism. If you're negotiating as a sub, ask specifically whether there is an award fee flow-down and what performance metrics govern it.

What are typical AQL thresholds for government IT availability?

Typical Acceptable Quality Levels (AQLs) for government IT availability fall in the 99.0-99.9% range for mission-important systems, with 99.5% being common for FISMA Moderate impact systems. High-impact systems or mission-critical operational systems may require 99.95%+. The AQL is typically lower than the target — for example, the SLO target might be 99.9% availability, but the AQL (below which the contractor is in breach) might be 99.5%. This creates a buffer between engineering targets and contractual minimum performance.
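
The buffer is easier to see in downtime minutes. A quick illustrative calculation for a 30-day month:

    # Allowed downtime per 30-day month at different availability levels.
    month_minutes = 30 * 24 * 60  # 43,200 minutes
    for pct in (99.9, 99.5):
        allowed = month_minutes * (1 - pct / 100)
        print(f"{pct}% -> {allowed:.0f} minutes of downtime allowed")
    # 99.9% -> 43 minutes; 99.5% -> 216 minutes. The gap between the SLO
    # target and the AQL amounts to roughly 173 minutes of contractual buffer.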

How does PBSA interact with CMMI or CMMC compliance requirements?

PBSA governs the service delivery structure and performance measurement. Compliance frameworks like CMMI (capability maturity), CMMC (cybersecurity), and RMF/ATO (security authorization) are separate requirements imposed by the nature of the program. A PBSA contract can also include CMMC compliance as a performance standard — for example, "zero critical open vulnerabilities past their remediation SLA" or "100% of CMMC Level 2 controls implemented and evidenced." When these requirements appear in the PWS as performance standards with AQLs, they become award fee criteria — creating financial consequences for compliance failures that go beyond the typical compliance audit.

Can a small business IT company prime a PBSA government contract?

Yes — small businesses prime PBSA contracts regularly, particularly at ACAT III levels and on IDIQ task orders under established small business vehicles (8(a), STARS III, Polaris, etc.). The key requirement is that the prime can credibly demonstrate capacity to meet the performance standards — both the technical delivery capability and the program management infrastructure to monitor and report performance. Small businesses with genuine operational track records (production systems, real customers, measurable performance histories) are better positioned than those whose experience is entirely in government contract execution.

Rutagon partners with primes to deliver performance-based government IT →
