It is not only the speed of information that is crucial, but also the quality of the data provided. Without guaranteeing a certain level of data quality, Business Intelligence is, at best, untrustworthy. At worst, it leads to the wrong decision. Strategies to ensure high-quality data start at the field level, where sensor data is acquired.
Three criteria for data quality
- Enough, but not too much:
Yes, the amount of data is a factor in data quality. Even though analytical tools are constantly evolving and hardware power is easily scalable in cloud applications, it is much easier to find the needle if you remove the haystack first.
- Consistent and complete:
Applications in quality control are a good example of the importance of data consistency, whether in manufacturing, disaster prevention or electrical grid control. Reports may look fine even when a critical value that happens to be out of range is exactly the one missing from the database.
- Correct:
It almost seems too obvious to mention that data needs to be correct to serve as the basis for correct analyses. Yet delivering correct data from field-level systems is often a complex challenge, especially when the underlying data is already a derivative of primary data, such as availability times of production equipment or infrastructure components.
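Checks for the criteria above can be automated at the point of ingestion. The following is a minimal Python sketch, not tied to any specific product: all names, timestamps and thresholds are illustrative assumptions. It flags both gaps in a time series (completeness) and values outside a plausible range (correctness):

```python
from datetime import datetime, timedelta

def validate_series(readings, start, end, step, value_range):
    """Check a {timestamp: value} dict for completeness and plausibility.

    Returns (missing_timestamps, out_of_range_readings). All parameters
    are illustrative; real systems would derive them from configuration.
    """
    missing, out_of_range = [], []
    lo, hi = value_range
    t = start
    while t <= end:
        if t not in readings:
            missing.append(t)                      # gap in the series
        elif not (lo <= readings[t] <= hi):
            out_of_range.append((t, readings[t]))  # implausible value
        t += step
    return missing, out_of_range

# Hypothetical minute-resolution temperature readings with one gap
# and one implausible spike:
readings = {
    datetime(2024, 1, 1, 0, 0): 21.5,
    datetime(2024, 1, 1, 0, 2): 98.7,
}
missing, bad = validate_series(
    readings,
    start=datetime(2024, 1, 1, 0, 0),
    end=datetime(2024, 1, 1, 0, 2),
    step=timedelta(minutes=1),
    value_range=(0.0, 50.0),
)
# missing contains the 00:01 gap; bad contains the 98.7 spike at 00:02
```

A report built on the raw `readings` dict would silently omit the missing minute; an explicit check like this surfaces the gap before it distorts an analysis.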
To ensure data quality, straton and zenon offer a wide range of capabilities for on-the-fly data preprocessing. Data is checked and processed at different levels as it moves upstream.
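How straton and zenon implement this preprocessing is product-specific and not detailed here; as a generic illustration of the kind of field-level reduction involved, a simple deadband filter can be sketched in Python (function name and thresholds are assumptions for illustration):

```python
def deadband_filter(samples, deadband):
    """Forward only samples that differ from the last forwarded value
    by more than the deadband, dropping redundant noise at the source."""
    forwarded = []
    last = None
    for value in samples:
        if last is None or abs(value - last) > deadband:
            forwarded.append(value)
            last = value
    return forwarded

# Hypothetical raw temperature samples: small jitter around 20-21 °C,
# then a genuine change to 25 °C.
raw = [20.0, 20.1, 20.05, 20.9, 21.0, 21.02, 25.0]
kept = deadband_filter(raw, deadband=0.5)
# kept == [20.0, 20.9, 25.0]
```

Filtering at the source like this serves the first criterion ("enough, but not too much"): the significant changes survive, while jitter never reaches the upstream database at all.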