In aerospace, the system of record (SOR) should be assigned deliberately by data domain, based on who owns the data, how it changes through the lifecycle, and where regulatory evidence must be trusted during audits. This has to be done within the constraints of existing MES/ERP/PLM/QMS stacks and their integration quality.
Start with data domains, not systems
Decide the SOR by data domain first, then map domains to systems. Typical domains include:
- Product definition: part structures, configurations, approved design data
- Manufacturing definition: routings, work instructions, tooling requirements
- Execution data: work orders, as-built/as-maintained records, operator signoffs
- Quality data: inspections, NCRs, MRB, CAPA, FAI/AS9102 results
- Supply chain and materials: POs, inventory, lot/batch, supplier data
- Configuration & maintenance: serialized asset history, modifications, SB/AD status
For each domain, define one SOR. Other systems can hold copies or derivatives, but not competing “truth.”
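The "one SOR per domain" rule can be expressed as a simple lookup that integration code and governance reviews share. This is a minimal sketch; the domain and system names are illustrative assumptions, not a recommended assignment.

```python
# Hypothetical SOR registry: exactly one authoritative system per data domain.
# Domain keys and system values are illustrative assumptions only.
SOR_MAP = {
    "product_definition": "PLM",
    "manufacturing_definition": "MES",
    "execution_data": "MES",
    "quality_data": "QMS",
    "supply_chain_materials": "ERP",
    "configuration_maintenance": "PLM",
}

def sor_for(domain: str) -> str:
    """Return the single system of record for a domain, or fail loudly."""
    try:
        return SOR_MAP[domain]
    except KeyError:
        raise ValueError(f"No SOR assigned for domain: {domain}")
```

Because the structure is a dict keyed by domain, it is impossible to register two competing "truths" for the same domain; other systems simply do not appear as values for it.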
Use clear criteria to pick the SOR
At a minimum, apply the same decision logic to every data type:
- Authoring authority: Where is the data initially created and governed? (e.g., engineering in PLM, quality in QMS, planners in ERP/MES)
- Lifecycle ownership: Which function is responsible for changes over time and approvals under formal change control?
- Regulatory and audit reliance: Which system is cited in procedures and actually used to demonstrate compliance to AS9100/AS9102 or airworthiness authorities?
- Versioning and traceability strength: Which system has adequate audit trails, electronic signatures, and configuration control for that data type?
- Operational primacy: Which system do people actually use on the floor to make decisions in real time?
The best SOR is usually the system that both owns the lifecycle and can stand on its own during an audit for that domain, not just the system that “happens to have a field” for that data.
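The criteria above can be made explicit with a simple weighted scoring exercise per domain. This sketch assumes illustrative weights and 0-to-5 scores; in practice, data owners and quality would assign both, and the scoring is an input to the decision, not the decision itself.

```python
# Hypothetical weighted scoring of SOR candidates against the five criteria.
# Weights and per-system scores are illustrative assumptions.
CRITERIA_WEIGHTS = {
    "authoring_authority": 3,
    "lifecycle_ownership": 3,
    "audit_reliance": 4,       # regulatory evidence weighted heaviest
    "traceability": 3,
    "operational_primacy": 2,
}

def score(candidate_scores: dict) -> int:
    """Weighted sum of one candidate's per-criterion scores (0-5 each)."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in candidate_scores.items())

def pick_sor(candidates: dict) -> str:
    """Return the candidate system with the highest weighted score."""
    return max(candidates, key=lambda name: score(candidates[name]))
```

For example, an MES that is weak on authoring but strong on audit trails and floor usage can still outscore an ERP for execution data, which matches the intuition that audit-ready real usage beats "happens to have a field."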
Typical SOR patterns in aerospace (with caveats)
Patterns vary by organization maturity and vendor stack, but common assignments look like:
- Product definition & EBOM: PLM / PDM is usually SOR.
- MBOM & routings: Often ERP or MES. In more advanced digital thread setups, PLM might own MBOM with ERP/MES consuming it.
- Digital work instructions and process plans: MES or dedicated work-instruction system, especially if e-signatures and revision control are required at the station.
- Work orders & execution status: A split is common: ERP as SOR for WO identity and financials; MES as SOR for detailed execution (operation-level status, timestamps, operator signoffs, detailed genealogy).
- Quality events (NCR, MRB, CAPA): QMS (or MES with integrated quality) as SOR. ERP or PLM may hold references but not be the authoritative source.
- FAI / AS9102 data: Often QMS, FAI-specific system, or MES module is SOR, with exports to customer portals as a projection, not as the SOR.
- Materials and inventory: ERP is typically SOR; MES may be SOR for intra-plant WIP location and consumption history, with periodic reconciliation.
- Serialized configuration & maintenance history: For OEMs, PLM or MES can be SOR for as-built configuration; for MRO, an MRO/maintenance system is usually SOR for as-maintained data.
These are patterns, not rules. If your PLM is weak on shopfloor usability or your MES is immature, the assignments will shift. The key is to choose deliberately and document it.
Address brownfield reality and coexistence
Most aerospace organizations cannot replace ERP/PLM/MES/QMS in one step due to validation effort, qualification, and downtime risk. That means the SOR decision must assume coexistence:
- Avoid dual-write: If two systems both allow editing the same data type, you have no real SOR. Restrict one to read-only or derivatives.
- Define data ownership by phase: Example: engineering owns the EBOM in PLM; industrialization converts EBOM to MBOM in PLM or ERP; MES only consumes the released MBOM and cannot alter it without a controlled feedback loop.
- Interface contracts: For each interface, specify which fields are mastered where, direction of flow, timing (real time vs batch), and conflict resolution rules.
- Local vs global truth: A station HMI can be the local operational view, but the SOR is still MES or QMS if that is where audit trails and approvals live.
- Change control impact: Changing an SOR is a significant project in a regulated environment; it often requires updates to procedures, training, validation, and sometimes customer approval.
Full rip-and-replace strategies that try to make one new platform the SOR for “everything” often fail due to migration complexity, certification risk, and the need to re-validate long-lived programs. Incremental, domain-by-domain SOR decisions are usually more realistic.
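The interface-contract idea above can be captured as a small, explicit record per flow, so that field ownership, direction, timing, and conflict handling are written down rather than implied by middleware configuration. This is a hypothetical sketch; the field names and the PLM-to-MES example values are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Literal

# Hypothetical interface contract between two coexisting systems.
# All names and example values are illustrative assumptions.
@dataclass(frozen=True)
class InterfaceContract:
    source: str                    # system that masters the data
    target: str                    # consuming system (read-only copy)
    mastered_fields: tuple         # fields owned exclusively by the source
    direction: Literal["one_way", "bidirectional"]
    timing: Literal["real_time", "batch"]
    conflict_rule: str             # e.g. "source wins; log and alert"

mbom_flow = InterfaceContract(
    source="PLM",
    target="MES",
    mastered_fields=("part_number", "revision", "bom_lines"),
    direction="one_way",
    timing="real_time",
    conflict_rule="source wins; target raises a discrepancy report",
)
```

Making the contract a frozen dataclass mirrors the governance intent: the contract itself changes only through change control, not ad hoc edits.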
Make SOR decisions traceable and enforceable
Deciding is only half the work. You also need governance so the SOR model survives reorganizations and new projects.
- Create a data ownership RACI: For each data domain, document who owns, approves, changes, and consumes it. Link this to your QMS procedures where applicable.
- Maintain a master data map: For each data type, list the SOR, downstream systems, and integration flows. This is essential for troubleshooting audit gaps and integration issues.
- Align with validation and qualification: The SOR must be on systems that are appropriately validated/qualified for their role in regulated processes, with change control procedures in place.
- Control user access: If a system is not SOR for a domain, limit or remove user rights to modify that data there. Otherwise the SOR model collapses in practice.
- Monitor for drift: Periodically review where users actually work. If operators or engineers are bypassing the SOR because it is hard to use, either fix usability or formally adjust your SOR decision.
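Drift monitoring of the kind described above can be partly automated if systems emit write events: compare each write against the declared SOR map and flag writes landing outside it. A minimal sketch, assuming a hypothetical event format with `domain`, `system`, and `user` keys:

```python
# Hypothetical drift check: flag writes that bypass the declared SOR.
# SOR_MAP and the write-event schema are assumptions for illustration.
SOR_MAP = {"quality_data": "QMS", "materials": "ERP"}

def find_drift(write_events: list) -> list:
    """Return events where a non-SOR system modified a domain's data."""
    return [
        e for e in write_events
        if e["domain"] in SOR_MAP and e["system"] != SOR_MAP[e["domain"]]
    ]
```

A periodic report from such a check tells you where users actually work, which is exactly the signal needed to decide between fixing usability and formally moving the SOR.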
Key tradeoffs to acknowledge
SOR choices always involve tradeoffs:
- Usability vs purity: The system with the best data model may not be the one people actually use. For regulated data, evidence often favors the system that captures real usage with audit trails.
- Centralization vs resilience: Centralizing too much into one SOR can create a single point of failure; spreading SOR roles too widely increases integration and governance burden.
- Short-term convenience vs long-term traceability: Allowing multiple “truths” can feel faster initially but usually leads to nonconformance risk, rework, and difficult audits.
Practical steps to get started
- Inventory your major systems (ERP, PLM, MES, QMS, MRO, FAI tools, supplier portals) and the data they hold.
- List your critical data domains and which system currently behaves as SOR in practice.
- Identify conflicts where two systems are effectively acting as SOR for the same data type.
- For each conflict, decide which system will be SOR using the criteria above, then update interfaces, access rights, and procedures accordingly.
- Document your SOR map and integrate it into onboarding, change control, and system upgrade planning.
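The conflict-identification step can be sketched as an inversion of the system inventory: map each data type to the set of systems mastering it, and keep only the types claimed by more than one. The inventory contents are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical system inventory: which data types each system masters today.
# Systems and data types are illustrative assumptions.
inventory = {
    "ERP": {"work_orders", "inventory", "mbom"},
    "MES": {"work_orders", "execution_status"},
    "PLM": {"ebom", "mbom"},
}

def find_conflicts(inventory: dict) -> dict:
    """Map each data type to its claiming systems; keep only conflicts."""
    owners = defaultdict(set)
    for system, data_types in inventory.items():
        for dt in data_types:
            owners[dt].add(system)
    return {dt: sorted(s) for dt, s in owners.items() if len(s) > 1}
```

Each conflict this surfaces (here, work orders and the MBOM) is a decision to make using the criteria above, followed by updates to interfaces, access rights, and procedures.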
The goal is not theoretical perfection but a defensible, documented SOR model that aligns with how your aerospace programs actually run, can survive audits, and can evolve without breaking traceability.