Time grain is the level of time detail at which data is recorded, aggregated, displayed, or analyzed. It commonly refers to the size of the time interval used in a dataset or report, such as seconds, minutes, hours, days, weeks, or production shifts.
In manufacturing and industrial systems, time grain affects how operational events are represented across MES, ERP, historian, SCADA, quality, and analytics workflows. For example, machine state changes may be captured at a second-level grain, while production reporting may be summarized by shift or by day.
Time grain describes the temporal resolution of a record or metric. It does not, by itself, define the total time period being analyzed. A report can cover one month of data but still use an hourly or daily time grain.
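The distinction between resolution and total period can be sketched in code. The snippet below is a minimal stdlib-only illustration (the event timestamps and the `truncate_to_grain` helper are hypothetical, not from any specific MES or historian API): second-level machine events covering a span of time are re-expressed at an hourly grain by truncating each timestamp to the start of its hour bucket.

```python
from collections import Counter
from datetime import datetime

def truncate_to_grain(ts: datetime, grain: str) -> datetime:
    """Truncate a timestamp to the start of its time-grain bucket."""
    if grain == "hour":
        return ts.replace(minute=0, second=0, microsecond=0)
    if grain == "day":
        return ts.replace(hour=0, minute=0, second=0, microsecond=0)
    raise ValueError(f"unsupported grain: {grain}")

# Hypothetical machine state events recorded at a second-level grain.
events = [
    datetime(2024, 3, 1, 8, 0, 5),
    datetime(2024, 3, 1, 8, 59, 59),
    datetime(2024, 3, 1, 9, 15, 0),
]

# The same events summarized at an hourly grain: two fall in the
# 08:00 bucket, one in the 09:00 bucket. The grain changed; the
# underlying time period covered did not.
hourly_counts = Counter(truncate_to_grain(ts, "hour") for ts in events)
```

The same pattern extends to daily or shift grains by changing only the truncation rule, which is why grain is usually a parameter of a report rather than a property of the raw data.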
Time grain matters when comparing data across systems or deciding whether a metric is suitable for a given use. Finer grain data can show short stoppages, alarms, or process variation. Coarser grain data is more common for management reporting, scheduling, financial rollups, or longer-term quality trends.
Examples in manufacturing include:

- Machine state changes and alarms captured at a second-level grain
- Downtime and production counts summarized by hour or by production shift
- Management, scheduling, and quality reporting rolled up by day, week, or month
Time grain is often confused with time range and sampling rate. Time range is the total period under review, such as the last 30 days. Sampling rate refers to how often a sensor or system captures raw observations, which may be finer than the grain used for reporting. Time grain is also different from update frequency, which describes how often a dashboard or interface refreshes.
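The relationship between sampling rate and reporting grain can be made concrete with a short sketch. Below, a hypothetical sensor sampled once per second (the sampling rate) is averaged into one-minute buckets (the reporting grain); the sample values are invented for illustration.

```python
from statistics import mean

# Hypothetical sensor readings captured once per second (sampling
# rate = 1 Hz). Two minutes of raw observations.
samples = [20.0 + 0.1 * i for i in range(120)]

# Reporting grain: one minute. Each reported value averages 60 raw
# samples, so the grain is coarser than the sampling rate.
GRAIN = 60  # raw samples per reporting bucket
minute_grain = [
    mean(samples[i:i + GRAIN]) for i in range(0, len(samples), GRAIN)
]
```

Here 120 raw observations collapse to 2 reported values, while the time range (two minutes) is unchanged, which is exactly the separation of concepts the paragraph above describes.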
In analytics and data warehousing, the term may also be discussed alongside data granularity. Time grain is the time-specific part of that broader idea.