No. Although the terms are often used interchangeably, at APERIO we define data integrity as having four elements:
• Data Quality: Ensure validated, reliable, clean data across the enterprise, as measured by DQI
• Data Security: Identify security threats on equipment through asset data visibility in real time
• Data Intelligence: Empower operators to trust the data and make better decisions with confidence
• Data Value: Achieve sustainability and business goals (via optimization, predictive modeling, AI, etc.) based on validated data
APERIO offers more than a data quality or asset health check: we ensure your data is accurate, reliable, and complete before it is used in any application. See Applications below to learn more.
Inaccurate or unreliable data is not necessarily the result of entering data into a system without controls. Beyond outright bad values, the quality of the data itself can be in question: the structure of the data, mismatched units of measurement, values that are out of range, or abrupt changes in the signal, to name a few. Review APERIO’s machine learning engines to see all the types of anomalies it can detect.
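To make a couple of these anomaly types concrete, here is a minimal, illustrative sketch (not APERIO's engine; the function name, thresholds, and sample data are hypothetical) of rule-based checks for out-of-range values and abrupt changes in a series of sensor readings:

```python
# Illustrative sketch of two anomaly types mentioned above:
# out-of-range values and abrupt changes. Thresholds are hypothetical.

def find_anomalies(values, low, high, max_step):
    """Return indices of out-of-range values and of abrupt jumps."""
    out_of_range = [i for i, v in enumerate(values) if not (low <= v <= high)]
    abrupt = [i for i in range(1, len(values))
              if abs(values[i] - values[i - 1]) > max_step]
    return out_of_range, abrupt

readings = [21.0, 21.3, 21.1, 95.0, 21.2, 21.4]  # spike at index 3
oor, jumps = find_anomalies(readings, low=0.0, high=50.0, max_step=10.0)
print(oor)    # → [3]
print(jumps)  # → [3, 4]  (the jump up to the spike and back down)
```

In practice such fixed thresholds would be far too brittle at plant scale, which is exactly why the machine-learning approach described below learns normal behavior from the data itself.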
Companies have failed to solve data integrity issues for decades because the problem is so large: it is too difficult to automatically detect anomalies at scale and in real time while preventing false positives. When we ask customers why, with all the technology available today, they still can't do it, they tell us it is too expensive to manually manage and sensitize 1M–2M tags, and that they don't have the resources to do this at scale.
Patented Machine Learning defines normal time series behavior based on historical data via the ‘Fingerprinting’ process.
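The core idea behind fingerprinting can be sketched as follows. This is an assumed, simplified illustration of learning a profile of normal behavior from history and flagging deviations, not APERIO's patented algorithm; the function names and the k-sigma rule are hypothetical:

```python
# Minimal sketch: learn a statistical "fingerprint" of normal behavior
# from historical data, then flag new readings that deviate from it
# by more than k standard deviations. (Hypothetical illustration only.)
import statistics

def fingerprint(history):
    """Summarize normal behavior as (mean, stdev) of historical data."""
    return statistics.mean(history), statistics.stdev(history)

def is_anomalous(value, fp, k=3.0):
    """Flag a reading that falls outside the learned normal band."""
    mean, stdev = fp
    return abs(value - mean) > k * stdev

history = [20.1, 19.8, 20.4, 20.0, 19.9, 20.2, 20.3, 19.7]
fp = fingerprint(history)
print(is_anomalous(20.1, fp))  # → False: within the learned normal band
print(is_anomalous(35.0, fp))  # → True: far outside historical behavior
```

A real system would use a much richer model of time-series behavior than a single mean and standard deviation, but the principle is the same: the definition of "normal" is learned from the data rather than configured by hand for each of millions of tags.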