DIGITIZED, MONITORED, STORED AND ANALYSED
Adoption of the Industrial Internet of Things (IIoT) continues to grow rapidly. A 2020 study by Juniper Research predicts that the number of IIoT connections will more than double, from 17.7 billion in 2020 to 36.8 billion in 2025.
This high availability of operational data, and the insights that can be gained through analytics tools and AI, means that the vision of the ‘smart factory’ is well on its way to being realised. Eventually, every facet of industry will be digitized, monitored, stored and analysed, allowing industrial companies to achieve levels of safety, efficiency and automation that were not possible before.
However, there are still barriers to making this a reality, and arguably the biggest is poor data quality. A report by Siemens that surveyed 500 senior executives in industrial sectors found that 73% of respondents felt that “data integration and data quality issues are major or moderate barriers to adoption of AI.” The same report found that 50% of manufacturing respondents were classed as ‘AI delayers’ rather than ‘AI adopters’ – the highest percentage of all the sectors surveyed.
The Cost of Poor Data Quality
Poor quality OT data is also costing companies serious money today. A global chemical manufacturing company (now an APERIO customer) found that poor data quality was the central cause of an increasing number of asset failures, a lower overall equipment effectiveness (OEE), and annual losses of around $60 million. But what exactly is poor quality data, and how do these losses play out for industrial companies?
In industrial environments, poor quality data falls broadly into two categories. The first is operational data that does not reflect the true state of a facility’s assets. This type of data is often created when sensors begin to introduce ‘bad’ data into the system through sensor drift, fouling, misconfiguration, or a host of other causes. Drift is typically gradual, goes undetected by operators, and is allowed to continue for extended periods.
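To make the drift problem concrete, here is a minimal sketch (not APERIO’s method – the readings, window size, and tolerance are all invented for illustration) of how a gradually drifting sensor can be flagged by comparing its rolling mean against a known baseline:

```python
def rolling_mean(values, window):
    """Trailing rolling mean over a list of floats."""
    means = []
    for i in range(len(values)):
        start = max(0, i - window + 1)
        chunk = values[start:i + 1]
        means.append(sum(chunk) / len(chunk))
    return means

def first_drift_index(readings, baseline, window=5, tolerance=2.0):
    """Return the index where the rolling mean first departs from
    `baseline` by more than `tolerance`, or None if it never does."""
    for i, m in enumerate(rolling_mean(readings, window)):
        if abs(m - baseline) > tolerance:
            return i
    return None

# Synthetic sensor: true value is 100.0, but it drifts +0.5 per sample.
readings = [100.0 + 0.5 * i for i in range(20)]
idx = first_drift_index(readings, baseline=100.0)  # flags sample 7
```

The point of the sketch is that each individual reading looks plausible; only the slow trend against a trusted reference reveals that the sensor, not the process, has changed.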
The second type of poor data quality is created when operational data is formatted or labelled incorrectly, or the sensor data is fragmented across disconnected applications. This causes interoperability issues and creates silos of data that are never accessed or are unusable.
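The labelling half of the problem can be illustrated with a toy example. The tag names and the normalization rule below are hypothetical, but they show how the same sensor, recorded under different spellings in two systems, looks like two unrelated data streams until the labels are reconciled:

```python
import re

def normalize_tag(tag):
    """Map vendor-specific tag spellings onto one canonical form:
    trimmed, uppercased, separators collapsed to underscores."""
    return re.sub(r"[\s\-\.\/]+", "_", tag.strip()).upper()

# Invented examples: one temperature sensor, two naming conventions.
historian_tag = "reactor-1/temp"
scada_tag = "Reactor 1 Temp"

# Without normalization the records cannot be joined; with it,
# both resolve to the same canonical key.
assert normalize_tag(historian_tag) == normalize_tag(scada_tag)
```

In practice this mapping is rarely so mechanical – which is exactly why mislabelled and fragmented data ends up siloed and unused.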
Over time, poor quality data makes it virtually impossible to predict the reliability or failure rate of assets. This is the reason why predictive maintenance efforts often fail. In other cases, the poor quality of operational data is the root cause of sub-optimal yields, wasted raw materials, and expensive data cleaning efforts.
Rigorous Data Integrity
Jonas Hellgren, CEO of APERIO, explains: “Data integrity has become the number-one inhibitor to maximizing ROI in digital solutions for industrial organizations, with over 50 percent of today’s data science budgets dedicated to cleaning up data quality.”
Another challenge industrial companies face in managing operational data is a logistical one: as the volume of incoming sensor data rises, it becomes harder for operators to monitor it effectively and at scale. As a result, we often come across cases where asset failures could have been predicted, if only the variations in the data had been noticed early enough. In other cases, there is a lack of trust in the data, so operators simply opt to rely on their experience instead – either because of the data’s complexity or because they have previously experienced a high number of false-positive alerts.
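A back-of-the-envelope calculation shows why false positives erode trust at scale. The numbers here are invented for illustration, but the arithmetic is the point: even a tiny per-sensor error rate produces a steady stream of spurious alerts across a large plant.

```python
# Hypothetical plant: 10,000 monitored sensors, each with a 0.1%
# chance of raising a false alert on any given day.
sensors = 10_000
daily_fp_rate = 0.001

# Expected false alerts per day across the whole plant.
expected_false_alerts_per_day = sensors * daily_fp_rate  # 10.0
```

Ten baseless alerts every day is more than enough to train operators to ignore the alerting system altogether.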
It is clear that digitization efforts in industry are accelerating. However, companies can only turn operational data into actionable insights by first taking an honest look at the state of their data quality and their ability to process it accurately, in real time and at scale. Once companies reach a state where they have full confidence in the quality of their operational data, even existing AI-based tools can make a profound impact on efficiency, safety, and automation.