Poor quality OT data is also costing companies serious money today. A global chemical manufacturing company (now an APERIO customer) found that poor data quality was the central cause of an increasing number of asset failures, lower overall equipment effectiveness (OEE), and annual losses of around $60 million. But what exactly is poor quality data, and how do these losses play out for industrial companies?
In industrial environments, poor quality data falls broadly into two categories. The first is operational data that does not reflect the true state of a facility’s assets. This type of data is typically created when sensors begin to introduce ‘bad’ readings into the system due to sensor drift, fouling, misconfiguration, or a host of other reasons. The degradation is usually gradual, goes undetected by operators, and is allowed to continue for extended periods.
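To make that failure mode concrete, here is a minimal Python sketch, with invented sensor values, drift rate, and thresholds, showing how a slow drift can stay inside fixed alarm limits for months while a rolling comparison against a redundant reference signal (one common detection approach, not necessarily what any particular vendor does) would flag it within days:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical temperature sensor: its reading drifts upward by
# 0.01 degrees per hour on top of the true process value.
hours = np.arange(24 * 90)                             # 90 days, hourly samples
true_temp = 80 + 2 * np.sin(2 * np.pi * hours / 24)    # daily process cycle
drift = 0.01 * hours                                   # slow, cumulative drift
reading = true_temp + drift + rng.normal(0, 0.5, hours.size)

# A fixed alarm limit never fires: after 90 days the drift adds ~21.6
# degrees, yet every individual sample still looks plausible, so
# operators watching live values see nothing unusual.
ALARM_HIGH = 120.0
print("fixed-limit alarms:", int(np.sum(reading > ALARM_HIGH)))

# A simple drift check: compare the rolling mean of the suspect sensor
# against a redundant reference measurement (assumed to exist here)
# and flag when the gap exceeds a tolerance.
reference = true_temp + rng.normal(0, 0.5, hours.size)
window = 24 * 7                                        # one-week rolling window
gap = np.convolve(reading - reference, np.ones(window) / window, mode="valid")
TOLERANCE = 2.0
first_flag = int(np.argmax(np.abs(gap) > TOLERANCE))
print(f"drift flagged within ~{(first_flag + window) / 24:.0f} days")
```

Under these made-up numbers, the fixed limit never trips in 90 days, while the rolling comparison flags the sensor in under two weeks; the point is not the specific method but that undetected ‘bad’ data accumulates silently unless something is actively checking for it.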
The second type of poor quality data is created when operational data is formatted or labelled incorrectly, or when sensor data is fragmented across disconnected applications. This causes interoperability issues and creates data silos that are either never accessed or unusable when they are.
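As an illustration (the tag names, units, and asset IDs below are invented), the same measurement can live under unrelated identifiers and units in two disconnected systems, and only a manually maintained mapping makes the records comparable:

```python
# Hypothetical records for the same pump discharge pressure, as it
# might appear in a process historian and a maintenance application.
historian_point = {"tag": "41-PT-1007.PV", "value": 6.2, "units": "bar"}
cmms_point = {"tag": "PUMP7_DISCH_PRESS", "value": 89.9, "units": "psi"}

# Without a shared naming convention and unit registry, nothing links
# these two records: each application builds its own partial,
# inconsistent picture of the asset.
UNIT_TO_BAR = {"bar": 1.0, "psi": 0.0689476, "kPa": 0.01}

# An illustrative tag map a data team would have to build and
# maintain by hand; the canonical asset IDs are made up.
TAG_TO_ASSET = {
    "41-PT-1007.PV": "pump-7/discharge-pressure",
    "PUMP7_DISCH_PRESS": "pump-7/discharge-pressure",
}

def normalize(point: dict) -> dict:
    """Map a raw point onto a canonical asset ID and unit (bar)."""
    return {
        "asset": TAG_TO_ASSET[point["tag"]],
        "value_bar": round(point["value"] * UNIT_TO_BAR[point["units"]], 3),
    }

print(normalize(historian_point))  # {'asset': 'pump-7/discharge-pressure', 'value_bar': 6.2}
print(normalize(cmms_point))       # {'asset': 'pump-7/discharge-pressure', 'value_bar': 6.198}
```

Every tag that falls outside such a mapping becomes exactly the kind of siloed, unusable data described above.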
Over time, poor quality data makes it virtually impossible to predict the reliability or failure rate of assets, which is why predictive maintenance efforts so often fail. In other cases, poor operational data is the root cause of sub-optimal yields, wasted raw materials, and expensive data cleaning efforts.