FAQ

Who should own and govern manufacturing KPI definitions in a multi-plant organization?

In a multi-plant, regulated manufacturing environment, no single function should unilaterally own manufacturing KPI definitions. Ownership and governance should sit with a cross-functional KPI governance group chartered by operations leadership, with clear accountabilities and formal change control.

Preferred ownership model

A practical and defensible model is:

  • Executive sponsor: VP/Head of Operations (or equivalent) owns the overall KPI framework, approves major changes, and arbitrates conflicts between sites or functions.
  • KPI governance group (core ownership): A standing cross-functional team responsible for defining, documenting, and changing KPI definitions. As a minimum, include representatives from:
    • Operations / manufacturing engineering (process and performance owners)
    • Quality (to align with QMS, CAPA, and audit expectations)
    • Finance / controlling (to align with financial reporting where relevant)
    • IT/OT or digital manufacturing (for data sources, system constraints, and validation)
    • Representatives from at least 2–3 plants (to cover different product lines, asset ages, and operating realities)
  • Plant management: Owns application of the standard KPIs locally, and may define additional local KPIs provided they do not change or obscure corporate definitions.

This structure keeps definitions consistent across plants while ensuring they are grounded in real operations, quality, and system capabilities.

What this group should own

The KPI governance group should have explicit ownership of:

  • Canonical KPI catalog: A controlled list of “official” manufacturing KPIs used for cross-site comparison (for example OEE, NPT, yield, scrap, rework rate, schedule adherence, on-time delivery to commit).
  • Exact definitions and formulas: For each KPI, clearly defined:
    • Purpose and scope (e.g., production vs. maintenance vs. quality)
    • Formula and units, including time base and aggregation rules
    • Inclusions and exclusions (for example, what counts as planned vs. unplanned downtime, what events are excluded as force majeure)
    • Data source systems and primary data owners
    • Known limitations (for example, legacy lines where certain events are not captured automatically)
  • Data lineage and traceability: Documented mapping from raw source data to KPI, including transforms, filters, and any manual adjustments, to support audits and investigations.
  • Governance processes: How KPIs are proposed, reviewed, approved, versioned, retired, and communicated.
  • Validation expectations: For regulated environments, what level of verification or validation is required when KPI logic or underlying systems change.
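As a concrete sketch of what a controlled catalog entry might capture, the following is a minimal, hypothetical data model (field and KPI names are illustrative, not tied to any specific MES or BI product); the point is that purpose, formula, time base, inclusions/exclusions, sources, and limitations all travel together under one version:

```python
from dataclasses import dataclass

# Hypothetical sketch of a canonical KPI catalog entry.
@dataclass(frozen=True)
class KpiDefinition:
    name: str              # canonical KPI name, e.g. "OEE"
    version: str           # bumped only through formal change control
    purpose: str           # scope statement (production vs. maintenance vs. quality)
    formula: str           # human-readable formula; the governed source of truth
    time_base: str         # time base and aggregation rules
    inclusions: tuple      # events that count toward the KPI
    exclusions: tuple      # events explicitly excluded (e.g. force majeure)
    data_sources: tuple    # source systems and primary data owners
    known_limitations: str = ""

oee = KpiDefinition(
    name="OEE",
    version="2.1",
    purpose="Overall equipment effectiveness for production lines",
    formula="availability * performance * quality",
    time_base="per shift, rolled up to calendar month",
    inclusions=("unplanned downtime", "minor stops", "scrap"),
    exclusions=("planned maintenance", "force majeure events"),
    data_sources=("MES downtime log (owner: site operations)",),
    known_limitations="Legacy lines: minor stops are not auto-captured",
)

def compute_oee(availability: float, performance: float, quality: float) -> float:
    """Standard OEE formula: the product of its three factors."""
    return availability * performance * quality

print(round(compute_oee(0.90, 0.95, 0.98), 3))
```

Keeping the formula and its exclusions in one versioned record is what makes "OEE 2.1 at plant A" and "OEE 2.1 at plant B" provably the same number.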

Why not let each plant own its own definitions?

Letting each site define KPIs independently often results in:

  • Non-comparable metrics: Plants may all report “OEE” or “on-time delivery” but use different formulas, time bases, or exclusions, making corporate rollups and benchmarking misleading.
  • Disputes in reviews: Leadership challenges the numbers, and time is spent reconciling definitions instead of addressing performance.
  • Audit and investigation risk: When incidents, customer complaints, or regulator questions arise, it is difficult to show consistent, traceable performance history across plants.
  • Integration churn: MES/ERP/BI teams continually adapt reports for each plant’s variant of “standard” KPIs, increasing cost and defect risk.

Individual plants should still have freedom to manage their local operations with additional KPIs, but corporate KPIs used for comparison and decision-making must have centrally governed definitions.

Role of IT/OT and analytics teams

IT/OT, data engineering, and analytics teams should not own KPI definitions in isolation, but they are essential partners:

  • Custodians of implementation: They implement the KPI logic in MES, historians, data platforms, and BI tools according to the approved definitions.
  • Feasibility checks: They advise on what is achievable with existing systems, data quality, and network constraints, and highlight where definitions need adjustment.
  • Change and validation support: They support impact analysis, testing, and validation when KPI definitions or source systems change.

Formal linkage to change management (for example via ITIL, CSV, or internal validation procedures) is important. KPI logic changes can alter reported performance and must not be silently deployed.

Handling brownfield and multi-system realities

In a typical brownfield landscape with multiple MES, historians, and manual data capture methods, a few practical rules help:

  • Central definition, localized implementation: Keep the KPI definition and intent consistent, but allow site-specific implementation notes where systems differ (for example, how “machine state” is inferred on older equipment).
  • Document exceptions: Where a plant cannot fully meet the standard definition due to system or sensor gaps, record the deviation explicitly and flag it on reports.
  • Avoid defining KPIs around one vendor’s tool: Define KPIs conceptually and formally first, then map to specific MES/ERP/SCADA fields per site.
  • Prioritize a core set: Start with a manageable list of high-value KPIs that all plants can implement, then extend as data and systems mature.
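One lightweight way to operationalize "central definition, localized implementation" is a per-site field map from canonical KPI inputs to local system tags, with documented gaps surfaced explicitly. The systems, plants, and tag names below are hypothetical:

```python
# Canonical inputs the governed downtime KPIs require.
CANONICAL_DOWNTIME_FIELDS = {"start", "end", "reason_code", "planned"}

SITE_FIELD_MAPS = {
    # A modern MES exposes the canonical fields almost directly.
    "plant_a": {
        "start": "evt_start_ts",
        "end": "evt_end_ts",
        "reason_code": "reason",
        "planned": "is_planned",
    },
    # An older line cannot flag planned/unplanned at source; the
    # deviation is recorded here rather than hidden in report logic.
    "plant_b": {
        "start": "t0",
        "end": "t1",
        "reason_code": "rc",
        "planned": None,  # documented exception: inferred downstream
    },
}

def unmapped_fields(site: str) -> set:
    """Canonical fields a site cannot supply directly (its documented exceptions)."""
    mapping = SITE_FIELD_MAPS[site]
    return {f for f in CANONICAL_DOWNTIME_FIELDS if mapping.get(f) is None}

print(unmapped_fields("plant_a"))  # set()
print(unmapped_fields("plant_b"))  # {'planned'}
```

The definition stays central and single-versioned; only the mapping layer varies per site, and every gap is machine-readable so it can be flagged on reports.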

Full system replacement just to standardize KPIs is rarely justifiable in regulated, long-lifecycle plants; the qualification, validation, downtime, and integration burdens tend to outweigh the benefit. Governance around definitions and mappings is usually more practical than wholesale replacement.

Key governance practices to put in place

Regardless of structure, the following practices matter more than the exact org chart:

  • Formal charter: A short document that states the governance group’s scope, decision rights, and escalation paths.
  • Version-controlled KPI catalog: A single source of truth (for example, under document control) where KPI definitions, owners, and status are maintained.
  • Change control and impact assessment: KPI definition changes go through impact assessment, stakeholder review (including key plants), and documented approval.
  • Alignment with QMS and internal standards: KPI documentation and changes align with existing document control and validation processes, not a parallel ad hoc process.
  • Training and communication: Plants are briefed when definitions change, with examples showing old vs. new behavior and any expected shifts in reported values.
  • Periodic audit: Periodic checks that systems, reports, and local spreadsheets still reflect the approved definitions.
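The periodic-audit practice can itself be partly automated: recompute a KPI from raw source records using the approved logic and compare it with the value a report or local spreadsheet claims. A minimal sketch, with an invented scrap-rate example and a tolerance chosen purely for illustration:

```python
def recompute_scrap_rate(produced_units: int, scrapped_units: int) -> float:
    """Canonical definition assumed here: scrap rate = scrapped / produced."""
    return scrapped_units / produced_units

def audit_reported_value(reported: float, recomputed: float,
                         tolerance: float = 1e-6) -> bool:
    """Flag drift between reported figures and the approved KPI logic."""
    return abs(reported - recomputed) <= tolerance

recomputed = recompute_scrap_rate(produced_units=10_000, scrapped_units=150)
print(recomputed)                               # 0.015
print(audit_reported_value(0.015, recomputed))  # True: report matches
print(audit_reported_value(0.021, recomputed))  # False: investigate
```

A failed check does not say who is wrong; it says the report and the governed definition have diverged, which is exactly what the audit is meant to catch.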

Summary

In a multi-plant organization, manufacturing KPI definitions should be owned by a cross-functional KPI governance group, sponsored by operations leadership and tightly linked to quality, finance, and IT/OT. Plants retain flexibility for local metrics, but the core KPIs used for comparison and management must be centrally defined, version-controlled, and subject to formal change control to remain credible, auditable, and useful.
