Yes, in most environments you can define aerospace- or MRO-specific KPIs alongside ISO 22400 metrics, but it is never “just add a field.” You need to design how those KPIs coexist with the standard model, how they are calculated, and how they are governed across systems.
What “alongside ISO 22400” usually means
In a regulated aerospace or MRO context, “alongside ISO 22400” typically means:
- Keeping ISO 22400 metrics (e.g., OEE-related KPIs) as a stable, reference baseline.
- Layering additional KPIs that are specific to aerospace or MRO, such as turnaround-time buckets, maintenance lineage adherence, concession rate, or AOG-related measures.
- Ensuring the new KPIs do not alter the definition or calculation of the ISO 22400 indicators without explicit change control.
Most MES, data historians, or analytics layers can support this, but they vary widely in how cleanly they handle custom KPIs and hierarchies.
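One way to make that separation concrete in an analytics layer is a small KPI catalog that tags each indicator as either ISO 22400 baseline or a governed local extension. This is a minimal sketch with hypothetical names (the `KpiDefinition` class and the sample entries are illustrative, not from the standard):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KpiDefinition:
    """A single KPI entry: standard baseline or local extension."""
    kpi_id: str
    name: str
    layer: str   # "iso22400" (frozen baseline) or "extension" (change-controlled)
    formula: str # human-readable formula, documented alongside the code

CATALOG = [
    KpiDefinition("OEE", "Overall Equipment Effectiveness", "iso22400",
                  "availability * performance * quality"),
    KpiDefinition("TAT_P90", "90th-percentile turnaround time", "extension",
                  "p90(release_ts - induction_ts) per check type"),
]

def baseline(catalog):
    """ISO 22400 indicators: never silently redefined."""
    return [k for k in catalog if k.layer == "iso22400"]

def extensions(catalog):
    """Custom aerospace/MRO KPIs, governed under separate change control."""
    return [k for k in catalog if k.layer == "extension"]
```

Keeping the layer tag in the definition itself makes it easy to enforce that baseline entries only change through explicit change control.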
Typical aerospace and MRO custom KPIs
Common examples that organizations add on top of ISO 22400 include:
- MRO turnaround metrics: TAT by tail/serial, by check type, by customer, or by station; proportion of work completed on first scheduled slot.
- Maintenance lineage and configuration KPIs: percentage of tasks with fully documented part lineage; defect recurrence on same tail/assembly; repairs without full trace to prior event or SB/AD.
- Scrap, rework, and concession KPIs: MRO scrap and waste by assembly or ATA chapter; percentage of jobs requiring deviation/repair dispositions; cost of poor quality specific to rework during shop visits.
- Fleet- and safety-focused metrics: repeat discrepancies on same tail within X cycles; findings per flight hour; ratio of unscheduled vs scheduled findings.
- Contractual and AOG metrics: on-time release vs contract TAT; jobs driving AOG exposure; hours in AOG state by root cause category.
These can coexist with ISO 22400 metrics if the data model and calculation rules are clearly separated and traceable.
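As an illustration of how one of these custom KPIs can be computed from execution data, here is a minimal sketch of mean turnaround time (TAT) by check type, assuming shop-visit records carry an induction and a release timestamp (the tuple layout and sample dates are invented for the example):

```python
from datetime import datetime
from collections import defaultdict

def tat_days(induction: datetime, release: datetime) -> float:
    """Turnaround time in days for one shop visit."""
    return (release - induction).total_seconds() / 86400.0

def mean_tat_by_check_type(visits):
    """Average TAT per check type; `visits` is a list of
    (check_type, induction_ts, release_ts) tuples."""
    buckets = defaultdict(list)
    for check_type, induction, release in visits:
        buckets[check_type].append(tat_days(induction, release))
    return {ct: sum(v) / len(v) for ct, v in buckets.items()}

visits = [
    ("C-check", datetime(2024, 1, 1), datetime(2024, 1, 22)),  # 21 days
    ("C-check", datetime(2024, 2, 1), datetime(2024, 2, 20)),  # 19 days
    ("A-check", datetime(2024, 3, 1), datetime(2024, 3, 3)),   # 2 days
]
print(mean_tat_by_check_type(visits))
# {'C-check': 20.0, 'A-check': 2.0}
```

The same bucketing pattern extends to TAT by tail, customer, or station; the governance point is that the time base (induction to release) is documented once, not reinvented per report.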
Key design constraints and tradeoffs
When adding custom KPIs, the most important design questions are:
- Scope vs standardization
If every site or program defines its own MRO KPIs, cross-site comparison becomes impossible. If you over-standardize, you may lose program- or platform-specific insight. Most organizations end up with a small “global” set plus limited local extensions.
- Data source alignment
ISO 22400 metrics might be calculated primarily from MES or machine states, while aerospace/MRO KPIs often require combining MES, ERP, MRO software, and sometimes PLM or fleet systems. Poor data integration leads to inconsistent numbers and erosion of trust in the KPIs.
- Impact on OEE/OLE and capacity metrics
If you redefine availability, performance, or quality to bake in fleet or MRO concepts (e.g., AOG penalties), you may break comparability with ISO 22400 baselines. A safer approach is to keep ISO 22400 calculations intact and create related, but distinct, derived KPIs.
- Overhead vs insight
Every additional KPI adds work: data quality monitoring, explanations for auditors, change control, and retraining. A narrow set of high-value KPIs is usually more sustainable than dozens of overlapping metrics.
Brownfield and system coexistence considerations
In brownfield aerospace and MRO environments, KPIs are rarely controlled by a single system. You typically have a mix of:
- Legacy MES or homegrown execution tools.
- Aviation MRO systems or customized ERP modules.
- Separate quality / NCR / CAPA and maintenance lineage tools.
- Fleet or airline systems with actual flight and utilization data.
Because of that:
- Full replacement is uncommon: Ripping out legacy MES or MRO systems just to “standardize metrics” is rarely feasible given validation cost, downtime risk, and the effort to requalify processes and integrations.
- Analytics or data hub layer is typical: Many organizations implement KPIs in an analytics or data warehouse layer that sits over MES/ERP/MRO, rather than inside each operational system. ISO 22400 metrics and custom aerospace/MRO KPIs are then calculated from harmonized data models.
- Multiple truths risk: If individual sites calculate KPIs locally in spreadsheets or custom reports while corporate analytics uses a central model, you can end up with conflicting values for “the same” indicator. Alignment on a single source of calculation logic is critical.
Traceability, validation, and change control
In regulated environments, adding or changing KPIs is not just a reporting decision. You should expect at least:
- Documented definitions for every KPI: purpose, formula, time bases, data sources, and inclusion/exclusion rules.
- Version control for KPI logic: when a formula changes, you need to be able to explain historical vs current calculations.
- Validation and verification of calculations: spot checks against raw transaction and event data, especially for metrics that drive capacity, cost, or safety-related decisions.
- Change control across systems: if you adjust a KPI definition in analytics, associated reports and operational dashboards must be updated in sync, with appropriate approvals.
- Audit trail for KPI usage: who accessed which KPI reports, and how those metrics feed management review, NCR, or CAPA decisions.
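The version-control requirement above can be made concrete with a small versioned-definition record that resolves which formula was in force on a given date. The record layout and change-reference IDs are hypothetical:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class KpiVersion:
    """One change-controlled revision of a KPI's calculation logic."""
    version: int
    formula: str
    effective_from: date
    change_ref: str  # e.g. a change-request or approval identifier

def formula_for(history, as_of: date) -> str:
    """Return the formula in force on a given date, so historical reports
    can be explained against the definition valid at the time."""
    applicable = [v for v in history if v.effective_from <= as_of]
    if not applicable:
        raise ValueError("no KPI version in force on that date")
    return max(applicable, key=lambda v: v.effective_from).formula

history = [
    KpiVersion(1, "scrap_qty / total_qty", date(2023, 1, 1), "CR-101"),
    KpiVersion(2, "scrap_qty / inspected_qty", date(2024, 6, 1), "CR-245"),
]
```

With this shape, explaining a 2023 number to an auditor means quoting version 1's formula and its approval reference, not the current definition.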
Practical implementation approach
A pragmatic pattern for adding aerospace/MRO KPIs alongside ISO 22400 is:
- Lock down ISO 22400 definitions as your baseline set. Document them and map each one to concrete system fields and states.
- Identify a small set of aerospace/MRO essentials that you cannot get from ISO 22400 alone (e.g., TAT, maintenance lineage completeness, MRO scrap and waste, AOG exposure).
- Design a harmonized data model that can calculate both ISO 22400 metrics and the new KPIs from the same underlying event, routing, and quality data where possible.
- Prototype in a non-production analytics environment and reconcile results against existing site reports to catch integration and definition mismatches.
- Run change control: approve, document, and train users on the new KPI set; update procedures for performance reviews, problem solving, and management review.
- Iterate slowly: retire unused KPIs and refine definitions rather than continuously adding more metrics.
Linking back to ISO 22400
As long as you preserve the standard ISO 22400 metrics as a stable baseline, you can add aerospace/MRO KPIs on top, referencing them as derivatives or complements rather than replacements. This allows you to keep comparability across plants and vendors while still capturing aerospace-specific realities like turnaround, maintenance lineage, and MRO-specific scrap and waste.