The short answer is: more than just part numbers and documents.

Before implementing Connect 981, you usually need a defined baseline of operational, quality, and system data so the platform can support real workflows instead of becoming another disconnected layer. The required data set depends on which Connect 981 capabilities you are deploying, which systems remain system-of-record, and how standardized your processes already are.

Core data typically needed

  • Item and part master data
    Part numbers, descriptions, revisions, units of measure, status, effectivity where applicable, and any classification needed to route work correctly.

  • BOM and structure data
    Assemblies, subassemblies, component relationships, approved alternates if used, and any configuration rules the process depends on.

  • Routing or process-step definitions
    Operations, work centers, sequence logic, inspection points, hold points, approvals, and required records at each step.

  • Document-controlled content
    Released work instructions, specifications, forms, templates, drawings, and revision-controlled reference documents. If document control is weak, implementation risk goes up quickly.

  • Quality workflow data
    NCR categories, disposition paths, defect codes, causes, corrective action fields, approval chains, and links to the records that need to be preserved for traceability.

  • User, role, and responsibility mappings
    Who creates, reviews, approves, executes, and closes each process. This includes site, department, supplier, or program-specific access rules where relevant.

  • Organization and location data
    Plants, cells, lines, work centers, stock locations, supplier identities, customer or program references, and any hierarchy used for reporting or segregation.

  • Transaction and identifier standards
    Job numbers, work order numbers, serial numbers, lot numbers, operation codes, supplier references, and naming conventions. If these are inconsistent across systems, integration and traceability problems are common.
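As a sketch of what a numbering-convention check might look like, the snippet below validates a few hypothetical identifier formats with regular expressions. The patterns (e.g. `WO-`-prefixed work orders) are illustrative assumptions, not Connect 981 requirements; substitute your site's actual conventions.

```python
import re

# Hypothetical naming conventions -- replace with your site's agreed rules.
ID_PATTERNS = {
    "work_order": re.compile(r"^WO-\d{6}$"),         # e.g. WO-014233
    "serial":     re.compile(r"^SN[A-Z0-9]{8}$"),    # e.g. SN7F3K9Q2A
    "lot":        re.compile(r"^LOT-\d{4}-\d{3}$"),  # e.g. LOT-2024-117
}

def validate_identifier(kind: str, value: str) -> bool:
    """Return True if `value` matches the agreed convention for `kind`."""
    pattern = ID_PATTERNS.get(kind)
    return bool(pattern and pattern.fullmatch(value))

def find_nonconforming(kind: str, values: list[str]) -> list[str]:
    """List identifiers that break the convention -- candidates for cleanup."""
    return [v for v in values if not validate_identifier(kind, v)]
```

Running a check like this over exports from each legacy system before go-live surfaces the inconsistencies that otherwise become integration and traceability defects later.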

  • Integration mapping data
    Source systems, field mappings, API or file interfaces, record ownership, synchronization frequency, error handling rules, and what happens when data conflicts occur.
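One minimal way to make that mapping explicit is a declarative table that records, per field, the source, the target, the owning system, and the conflict rule. The sketch below assumes ERP-style source field names and a simple nested-dict record shape; all names are illustrative, not a Connect 981 schema.

```python
# Hypothetical field mapping for one object type (item master).
# Each rule records source field, target field, record owner, and
# what to do when the systems disagree.
ITEM_MASTER_MAP = [
    {"source": "erp.MATNR", "target": "item.part_number", "owner": "erp", "on_conflict": "reject"},
    {"source": "erp.MAKTX", "target": "item.description", "owner": "erp", "on_conflict": "source_wins"},
    {"source": "plm.REV",   "target": "item.revision",    "owner": "plm", "on_conflict": "source_wins"},
]

def transform(record: dict, mapping: list[dict]) -> dict:
    """Apply the field mapping to one inbound record of shape {system: {field: value}}."""
    out = {}
    for rule in mapping:
        system, field = rule["source"].split(".", 1)
        if field in record.get(system, {}):
            out[rule["target"]] = record[system][field]
    return out
```

Keeping the mapping as data rather than buried in interface code makes it reviewable during validation and easy to diff when a source system changes.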

  • Historical data, if migration is in scope
    Open records usually matter more than full history. Many teams overestimate the value of migrating everything and underestimate the effort needed to cleanse and validate legacy records.

What matters more than volume

Completeness helps, but data governance usually matters more than data volume. In most brownfield environments, the harder problem is not collecting data. It is deciding:

  • which system is authoritative for each record type

  • which identifiers must match across systems

  • which revisions are valid for execution

  • which records must be retained for traceability

  • how changes are reviewed, tested, and released

If those rules are unclear, implementation delays are likely even when the raw data exists.
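Those governance decisions can be captured as data rather than tribal knowledge. The sketch below records one authoritative system per record type and rejects writes from anywhere else; the system and object-type names are illustrative assumptions.

```python
# Hypothetical system-of-record register: one authoritative source per object type.
SYSTEM_OF_RECORD = {
    "part_master": "erp",
    "bom":         "plm",
    "work_order":  "mes",
    "ncr":         "qms",
}

def assert_authoritative(object_type: str, writing_system: str) -> None:
    """Raise if a non-authoritative system attempts to update a record type."""
    owner = SYSTEM_OF_RECORD.get(object_type)
    if owner is None:
        raise ValueError(f"no system-of-record decision recorded for {object_type!r}")
    if writing_system != owner:
        raise PermissionError(
            f"{writing_system!r} may not write {object_type!r}; owner is {owner!r}"
        )
```

An explicit register like this also gives integration code a single place to check before accepting an update, instead of scattering ownership assumptions across interfaces.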

Common readiness gaps

Typical problems before implementation include duplicate part masters, uncontrolled spreadsheet workflows, inconsistent defect codes, weak revision discipline, missing approval matrices, and poor linkage between ERP, MES, PLM, QMS, or supplier records. Connect 981 can be configured around some of this, but it cannot reliably compensate for unresolved ownership and governance issues.
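Some of these gaps are detectable before go-live. Duplicate part masters, for example, often differ only cosmetically, and a normalization pass like the sketch below can group them; the normalization rules (case, separators, whitespace) are assumptions to adapt to your numbering scheme.

```python
from collections import defaultdict

def normalize(part_number: str) -> str:
    """Collapse cosmetic differences that commonly hide duplicate parts."""
    return part_number.strip().upper().replace("-", "").replace(" ", "")

def find_duplicate_groups(part_numbers: list[str]) -> list[list[str]]:
    """Group raw part numbers that normalize to the same key."""
    groups = defaultdict(list)
    for pn in part_numbers:
        groups[normalize(pn)].append(pn)
    return [g for g in groups.values() if len(g) > 1]
```

Each group returned is a cleanup decision for a data owner, not something the new platform can resolve on its own.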

Another common mistake is assuming data preparation is a one-time migration task. In practice, regulated operations need an ongoing model for version control, validation, exception handling, and change control.

Brownfield reality

In most plants, Connect 981 will need to coexist with existing ERP, MES, PLM, QMS, document control, and supplier systems. That means the implementation team should define upfront:

  • what data stays where

  • what data is replicated versus referenced

  • what events trigger updates

  • how reconciliation is performed when records do not match

  • what validation evidence is required before go-live

A full rip-and-replace approach is often not realistic in regulated, long-lifecycle environments. Qualification burden, validation cost, downtime risk, integration complexity, and the need to preserve traceability usually make phased coexistence the lower-risk path.

Practical minimum starting set

If you are trying to scope the minimum viable preparation, start with:

  • released part and item master data

  • current revisions of controlled documents

  • workflow states and approval paths

  • user roles and permissions

  • key identifiers and numbering rules

  • system-of-record decisions for each major object

  • integration field mapping for the first live processes

  • cleansed open records that must continue in the new workflow

That is usually enough to begin design and pilot work. Broader historical cleanup and deeper harmonization can often be staged, but only if the boundaries are explicit and the risk is understood.

If you want a precise answer for your site, the real question is not only what data Connect 981 needs, but which business processes you are moving first, which systems it must coexist with, and what level of validation and traceability those processes require.
