Because using the same OEE formula does not mean the plants are measuring the same thing.
In most multi-plant environments, the gap is caused by differences in definitions, data capture, and operating context rather than the arithmetic itself. Two sites can both calculate Availability × Performance × Quality and still produce materially different numbers if they classify time, downtime, scrap, startup loss, rework, or planned stops differently.
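To make that concrete, here is a minimal Python sketch with hypothetical numbers: both plants run the identical shift and apply the same Availability × Performance × Quality arithmetic, but Plant A books a 60-minute changeover as downtime while Plant B excludes it from planned time. Nothing else differs, yet the headline numbers diverge by ten points.

```python
def oee(planned_min, downtime_min, ideal_cycle_min, good_count, total_count):
    """Standard OEE = Availability x Performance x Quality."""
    run_time = planned_min - downtime_min
    availability = run_time / planned_min
    performance = (ideal_cycle_min * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

# Same physical shift at both plants (hypothetical figures): 480 min
# scheduled, a 60 min changeover plus 30 min of breakdowns,
# 0.5 min/part ideal cycle, 700 parts produced, 680 good.
plant_a = oee(480, 60 + 30, 0.5, 680, 700)  # changeover booked as downtime
plant_b = oee(480 - 60, 30, 0.5, 680, 700)  # changeover excluded from planned time

print(f"Plant A: {plant_a:.1%}")  # 70.8%
print(f"Plant B: {plant_b:.1%}")  # 81.0%
```

Same formula, same shift, different time classification: the ten-point gap is entirely a definitional artifact, which is exactly the trap when these numbers land on an executive dashboard.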
Different time bases. One plant may calculate against scheduled production time, another against staffed time, and another may exclude meetings, preventive maintenance, changeovers, or engineering holds.
Different stop classifications. Microstops, waiting on material, first-piece inspection, tooling changes, quality holds, and operator breaks are often treated differently by site, line, or even shift.
Different ideal cycle assumptions. Performance depends heavily on the standard rate or ideal cycle time. If one plant uses engineered standards and another uses historical averages, the comparison is not equivalent.
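The sensitivity to the standard rate is easy to show. In this sketch (hypothetical numbers), the same run is scored once against an engineered standard and once against a historical-average cycle time; the slower historical baseline flatters Performance by almost nine points.

```python
# One run, two rate assumptions (illustrative values only).
run_time_min = 400
parts_made = 700

engineered_cycle = 0.50   # min/part, from an engineering study
historical_cycle = 0.55   # min/part, from last year's observed averages

perf_engineered = engineered_cycle * parts_made / run_time_min   # 0.8750
perf_historical = historical_cycle * parts_made / run_time_min   # 0.9625
```

A plant that pads its "ideal" rate with historical inefficiency will always look faster than one holding itself to an engineered standard.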
Different quality counting rules. Some sites count only final scrap in Quality. Others include rework, yield loss at intermediate steps, or inspection rejects at different points in the routing.
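The same divergence shows up in Quality. A short sketch with hypothetical counts: one site counts only final scrap, another also charges rework and intermediate yield loss against the metric.

```python
# Illustrative counts for one production order.
total_units = 1000
final_scrap = 20
rework = 35
intermediate_yield_loss = 15

# Site 1: only final scrap reduces Quality.
quality_scrap_only = (total_units - final_scrap) / total_units   # 0.930 vs 0.980 below
# Site 2: rework and intermediate losses are charged as well.
quality_all_losses = (
    total_units - final_scrap - rework - intermediate_yield_loss
) / total_units

print(quality_scrap_only)   # 0.98
print(quality_all_losses)   # 0.93
```

Five points of Quality, and therefore of OEE, purely from where in the routing losses are counted.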
Different automation levels. A highly instrumented line will capture short stops and speed loss that a manual line may never record consistently.
Different production models. High-mix, low-volume operations, batch processes, continuous processes, and heavily regulated inspection steps do not generate losses in the same pattern. OEE can still be useful, but direct cross-plant comparison may be misleading without context.
Different master data and routing discipline. Inaccurate work centers, obsolete cycle times, inconsistent part-family setup rules, and weak maintenance of routings will distort OEE even if plant teams believe the metric is standardized.
Different exclusion rules. Plants often remove special causes from reporting after the fact, such as customer holds, trial runs, validation batches, qualification work, or ERP scheduling gaps. Those choices change the number substantially.
Different system integration behavior. MES, SCADA, historians, machine gateways, ERP, and manual logs may not agree on order status, start and stop timestamps, scrap posting timing, or completed quantity. The formula only reflects the data it receives.
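A simple reconciliation check can surface these disagreements before they reach the KPI. This is a sketch only, with made-up order records and tolerances: it flags work orders where MES and ERP disagree on completed quantity or end time beyond a threshold.

```python
from datetime import datetime, timedelta

# Hypothetical order records from two systems.
mes = {"WO-1001": {"qty": 700, "end": datetime(2024, 5, 1, 14, 2)}}
erp = {"WO-1001": {"qty": 690, "end": datetime(2024, 5, 1, 14, 30)}}

def mismatches(mes, erp, qty_tol=0, time_tol=timedelta(minutes=10)):
    """Return (order, issue) pairs where the two systems disagree."""
    issues = []
    for order, m in mes.items():
        e = erp.get(order)
        if e is None:
            issues.append((order, "missing in ERP"))
            continue
        if abs(m["qty"] - e["qty"]) > qty_tol:
            issues.append((order, "quantity mismatch"))
        if abs(m["end"] - e["end"]) > time_tol:
            issues.append((order, "timestamp drift"))
    return issues

print(mismatches(mes, erp))
```

Here the order trips both checks: a 10-unit quantity gap and 28 minutes of timestamp drift. Whichever system feeds the OEE calculation wins by default unless a rule like this makes the disagreement visible.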
In mixed-vendor environments, cross-plant OEE differences are common. Plants often run different MES versions, different machine connectivity layers, different ERP posting patterns, and different local workarounds. That means the metric definition may look standardized in a slide deck while the source events are still inconsistent at the shop floor level.
This is also why full replacement is often not the practical answer. Replacing every execution and reporting system to force metric consistency can fail due to validation cost, qualification burden, downtime risk, integration complexity, and the fact that long-lived equipment often cannot be modernized uniformly. In regulated operations, a controlled semantic alignment effort is usually more realistic than a wholesale platform reset.
If the goal is true cross-plant comparison, standardize the operating definitions before debating the formula:
The production time model and what is included or excluded
A canonical event taxonomy for downtime, speed loss, startup loss, and quality loss
Rules for microstops, changeovers, preventive maintenance, inspection, and waiting states
The source of ideal cycle time or standard rate
How rework, scrap, and first-pass yield relate to Quality
How manual overrides are approved and traceable
Version control and change control for KPI definitions, routings, and master data
Data reconciliation rules across MES, ERP, historians, and machine data sources
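The canonical event taxonomy in that list can stay lightweight. One way to sketch it (hypothetical category names and reason codes): each site keeps its local codes but maps them to a shared loss taxonomy for enterprise reporting, with unmapped codes surfaced explicitly instead of silently dropped.

```python
from enum import Enum

class LossCategory(Enum):
    PLANNED_STOP = "planned_stop"
    UNPLANNED_DOWNTIME = "unplanned_downtime"
    SPEED_LOSS = "speed_loss"
    STARTUP_LOSS = "startup_loss"
    QUALITY_LOSS = "quality_loss"
    EXCLUDED = "excluded"      # governed exclusions, e.g. validation batches
    UNMAPPED = "unmapped"      # local code with no agreed mapping yet

# Hypothetical per-site mapping tables: local reason codes -> shared taxonomy.
SITE_A_MAP = {
    "CHG": LossCategory.PLANNED_STOP,        # changeover
    "BRK": LossCategory.UNPLANNED_DOWNTIME,  # breakdown
    "MST": LossCategory.SPEED_LOSS,          # microstop
}
SITE_B_MAP = {
    "C/O": LossCategory.PLANNED_STOP,
    "FAIL": LossCategory.UNPLANNED_DOWNTIME,
    "USTOP": LossCategory.SPEED_LOSS,
}

def canonical(site_map, local_code):
    # Unknown codes become an explicit UNMAPPED bucket so gaps in the
    # taxonomy show up in reporting rather than vanishing from OEE.
    return site_map.get(local_code, LossCategory.UNMAPPED)
```

The mapping tables themselves then become governed artifacts, subject to the same version control and change control as the KPI definitions above.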
Without that governance, cross-site OEE becomes a local reporting convention, not a reliable enterprise KPI.
OEE is useful for identifying loss within a given operating context. It is much less reliable as a raw leaderboard across plants with different product mix, labor models, inspection intensity, automation maturity, and data quality. A lower OEE does not automatically mean a plant is performing worse. It may mean that site records losses more honestly, runs more complex work, or includes regulated activities another plant excludes.
So the short answer is yes, your numbers can differ even with the same formula, and that is normal when definitions, data readiness, and process discipline are not harmonized. If executive decisions depend on plant-to-plant comparison, standardize semantics, trace the data lineage, and validate the calculation logic by site before treating the output as comparable.
Whether you're managing 1 site or 100, Connect 981 adapts to your environment and scales with your needs—without the complexity of traditional systems.