An aerospace supplier scorecard should include a small set of KPIs across delivery, quality, responsiveness, and business risk. In practice, most scorecards fail because they overemphasize on-time delivery and underweight traceability, repeat escapes, and issue closure behavior.

There is no single universal KPI set for every supplier. The right scorecard depends on what the supplier provides, whether they perform special processes or outside processing, the criticality of the parts, the maturity of your incoming inspection and supplier quality processes, and how reliably your ERP, MES, QMS, and receiving systems share data.

Recommended KPI categories

  • Delivery performance

    • On-time delivery to requested date
    • On-time delivery to promise date
    • Lead time adherence
    • Past due open orders
    • Schedule change responsiveness
    • Short shipment and over shipment rate
  • Quality performance

    • Received defect rate, usually PPM or defect incidents per lot
    • Incoming rejection rate
    • Escape rate, especially defects found after receipt or after use in production
    • Repeat nonconformance rate
    • Severity-weighted nonconformance rate for critical characteristics or regulated processes
    • Rework, scrap, or sorting events attributable to supplier issues
  • Corrective action and responsiveness

    • SCAR or CAPA response timeliness
    • Containment response time
    • Corrective action closure cycle time
    • Overdue action rate
    • Effectiveness of corrective actions, not just closure
  • Documentation and traceability

    • Certificate of conformance completeness and accuracy
    • FAI package completeness where applicable
    • Traceability record completeness by lot, serial, heat, or batch as required
    • Revision mismatch incidents
    • Missing or invalid required documents at receipt
  • Capacity and continuity risk

    • Supplier acknowledgment cycle time
    • Commit reliability over time
    • Single-source dependency exposure
    • Capacity constraint notifications
    • Recovery performance after disruption
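Most of the measures above reduce to simple arithmetic once receipt records are linked. The sketch below is illustrative Python with hypothetical field names (your ERP schema will differ); it computes on-time delivery to promise date and a received defect rate in PPM:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical receipt records; field names are illustrative, not tied to any ERP.
@dataclass
class Receipt:
    promise_date: date
    receipt_date: date
    qty_received: int
    qty_defective: int

def on_time_rate(receipts):
    """On-time delivery to promise date: fraction of receipts on or before promise."""
    on_time = sum(1 for r in receipts if r.receipt_date <= r.promise_date)
    return on_time / len(receipts)

def defect_ppm(receipts):
    """Received defect rate expressed as parts per million."""
    total = sum(r.qty_received for r in receipts)
    defective = sum(r.qty_defective for r in receipts)
    return 1_000_000 * defective / total

receipts = [
    Receipt(date(2024, 3, 1), date(2024, 3, 1), 500, 0),
    Receipt(date(2024, 3, 10), date(2024, 3, 12), 250, 5),
    Receipt(date(2024, 3, 20), date(2024, 3, 18), 250, 0),
]
print(on_time_rate(receipts))  # 2 of 3 receipts on or before promise
print(defect_ppm(receipts))    # 5 defective of 1,000 received = 5000 PPM
```

The same pattern extends to repeat nonconformance rate or escape rate; the hard part in practice is the data linkage, not the formula.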

What should usually be weighted most heavily

For aerospace, quality and delivery both matter, but weighting should reflect part risk. A machine shop making noncritical hardware should not be scored the same way as a supplier providing flight-critical assemblies, special process output, or serialized components with strict genealogy requirements.

A practical weighting model often looks like this:

  • Quality: 35 to 50 percent
  • Delivery: 25 to 40 percent
  • Responsiveness and corrective action: 10 to 20 percent
  • Documentation and traceability: 10 to 20 percent
  • Commercial factors such as cost variance: limited use unless clearly separated from compliance and quality metrics

If cost is included, keep it separate from compliance and conformance performance. Combining them into one composite number can hide unacceptable quality or traceability behavior.
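A weighting model of that shape is just a weighted sum, with cost variance deliberately left outside the composite. The weights and 0-to-100 category scores below are illustrative assumptions, not a recommendation for any specific supplier base:

```python
# Illustrative weighting sketch: category scores are 0-100, weights sum to 1.0.
# Cost variance is reported alongside the composite, never folded into it.
WEIGHTS = {
    "quality": 0.40,
    "delivery": 0.30,
    "responsiveness": 0.15,
    "documentation": 0.15,
}

def composite_score(category_scores):
    """Weighted composite of compliance/conformance categories only."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[cat] * category_scores[cat] for cat in WEIGHTS)

scores = {"quality": 70, "delivery": 95, "responsiveness": 80, "documentation": 60}
print(round(composite_score(scores), 2))  # ~77.5
```

Note how a supplier with strong delivery (95) but weak quality and documentation still lands in the high 70s; this is exactly why the underlying category scores must stay visible next to the composite.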

KPIs that are often missing but matter

  • Defects found after stock acceptance, not just at receiving
  • Repeat issues by failure mode
  • Documentation errors that delay release even when parts are physically usable
  • Supplier-caused production interruptions or line shortages
  • Aging of open supplier nonconformances
  • Performance segmented by commodity, process, site, or program instead of one supplier-wide average

Those metrics matter because supplier averages can look acceptable while one site, process family, or program creates most of the operational risk.
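The masking effect is easy to demonstrate. In the hypothetical numbers below (nonconformance events per 1,000 receipts, tagged by supplier site), the supplier-wide average looks tolerable while one site carries nearly all of the risk:

```python
from collections import defaultdict

# Hypothetical NCR rates per 1,000 receipts, tagged by supplier site.
events = [
    ("site_a", 2), ("site_a", 1), ("site_a", 3),
    ("site_b", 14), ("site_b", 18),
]

by_site = defaultdict(list)
for site, rate in events:
    by_site[site].append(rate)

overall = sum(rate for _, rate in events) / len(events)
print(round(overall, 1))  # 7.6: looks tolerable as one supplier-wide number
for site, rates in sorted(by_site.items()):
    print(site, sum(rates) / len(rates))  # site_b averages 16.0, driving the risk
```

The same segmentation applies by commodity, process family, or program; the grouping key changes, the logic does not.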

What to avoid

  • Do not rely on a single rolled-up score without the underlying drivers.
  • Do not use only receipt inspection failures if defects are often found later in assembly, test, or FAI.
  • Do not assume cross-supplier comparisons are fair unless date logic, defect counting rules, and document requirements are standardized.
  • Do not punish schedule changes caused by your own planning instability unless the metric explicitly separates buyer-driven versus supplier-driven variance.
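The last point is mechanical to enforce if your change-order data records who initiated each reschedule. A minimal sketch, assuming an `origin` field exists on each date change (in many legacy systems it does not, which is itself a finding):

```python
from dataclasses import dataclass

# Hypothetical schedule-change log; the "origin" field is an assumption and
# must come from your change-order data, not inferred after the fact.
@dataclass
class DateChange:
    po_line: str
    origin: str  # "buyer" or "supplier"

def supplier_driven_rate(changes):
    """Fraction of reschedules attributable to the supplier, not your own planning."""
    supplier = sum(1 for c in changes if c.origin == "supplier")
    return supplier / len(changes)

changes = [DateChange("PO1-1", "buyer"), DateChange("PO1-2", "supplier"),
           DateChange("PO2-1", "buyer"), DateChange("PO3-1", "buyer")]
print(supplier_driven_rate(changes))  # 0.25: most churn here is buyer-driven
```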

Brownfield system reality

In many plants, supplier scorecards are assembled from ERP receipts, QMS nonconformances, spreadsheets, email chasers, and sometimes portal data. That creates predictable problems: conflicting date fields, duplicate defect records, inconsistent supplier naming, and weak linkage between a PO line, received lot, NCR, and corrective action.

So yes, the KPI list matters, but the data model matters just as much. If your systems cannot reliably connect purchase orders, ASNs, receiving events, inspection results, nonconformances, and corrective actions, the scorecard will be argued over instead of used. In regulated environments, that also weakens traceability and change control around supplier performance decisions.
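One way to test whether that linkage exists is to model the chain explicitly. The dataclasses and keys below are illustrative, not any real ERP or QMS schema; the point is that a corrective action should be walkable back to a PO line, and a `None` anywhere along the walk is a broken link worth counting:

```python
from dataclasses import dataclass

# A minimal linkage sketch, assuming stable keys exist in each system.
# Field and class names are illustrative; real ERP/QMS schemas will differ.
@dataclass
class POLine:
    po_number: str
    line: int

@dataclass
class ReceivedLot:
    lot_id: str
    po_number: str
    po_line: int   # back-reference to the ordering PO line

@dataclass
class NCR:
    ncr_id: str
    lot_id: str    # ties the defect to a specific received lot

@dataclass
class SCAR:
    scar_id: str
    ncr_id: str    # ties the corrective action to the originating NCR

def trace(scar, ncrs, lots, po_lines):
    """Walk a SCAR back to its PO line; None anywhere means a broken linkage."""
    ncr = next((n for n in ncrs if n.ncr_id == scar.ncr_id), None)
    lot = next((lo for lo in lots if ncr and lo.lot_id == ncr.lot_id), None)
    return next((p for p in po_lines
                 if lot and p.po_number == lot.po_number and p.line == lot.po_line),
                None)

po = [POLine("PO100", 1)]
lots = [ReceivedLot("LOT7", "PO100", 1)]
ncrs = [NCR("NCR9", "LOT7")]
print(trace(SCAR("SCAR3", "NCR9"), ncrs, lots, po))  # resolves to PO100 line 1
```

Running this kind of trace across all open SCARs gives a direct measure of linkage health before anyone argues about the scorecard itself.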

Full replacement of ERP, QMS, or supplier management systems is often not the right answer. In aerospace and other long lifecycle environments, replacement programs commonly stall because of validation effort, qualification burden, integration complexity, downtime risk, and the need to preserve evidence trails across legacy processes. In many cases, a more realistic path is to improve KPI definitions, tighten master data, and add better workflow linkage across existing systems.

Recommended minimum scorecard

If you need a starting point, use a concise scorecard with 8 to 12 measures:

  • On-time delivery to requested date
  • On-time delivery to promise date
  • Past due line count or value
  • Incoming defect rate
  • Escape rate
  • Repeat nonconformance rate
  • SCAR response timeliness
  • Corrective action closure effectiveness
  • Documentation completeness at receipt
  • Traceability record completeness
  • Supplier acknowledgment or commit response time
  • Supplier-caused line disruption events

That is usually enough to distinguish a supplier that is merely shipping parts from one that is consistently supporting a controlled aerospace operation.

The final point is simple: a supplier scorecard should help you make sourcing, development, containment, and escalation decisions. If a KPI does not change behavior or support a defensible action, it probably does not belong on the scorecard.
