As mentioned in my post “Is Digital Twin and its Value fully understood?”, I want to share some experiences around Analytics.
Analytics is a widely used and sometimes misunderstood term.
Many companies ask me about predictive data usage, for example for machine maintenance or machine learning. However, once I visit their facilities, I often have to conclude that not even the first level of data usage has been established yet.
To explain the different levels, I personally like Gartner’s simple but precise model, which represents the maturity levels of data usage well.
Descriptive is the most commonly realized level. A perfect example is the morning meeting at the beginning of a shift, during which KPIs from the previous period are reported (SQPDC(1) meetings). However, this is not optimal, as the investigation only starts after the event.
At the Diagnostic level, the report automatically identifies the factor responsible for a deviation – as long as that specific data is measured. Depending on the type of business, this can already be quite difficult to compute. However, in this case too, the event has already happened and caused a negative impact.
For Predictive and Prescriptive, a lot of upfront work is needed beyond collecting data and completing the activities of the levels above. The sheer amount of data is only one aspect of teaching intelligence. A widely accepted characterization is the “3V” formula: in addition to the sheer “Volume” of data, “Variety” and “Velocity” are needed. Explaining this in more detail would fill an article on its own. But I do agree that this is the level at which the most value can be generated.
What strikes me the most is that many companies willing to take this journey usually don’t have the people, understanding, and skills in-house to do it right. This usually leads to failed projects and frustration.
I fully understand the motivation for achieving the Prescriptive/Decision-Automation level, as it influences performance and cost significantly. Nevertheless, there are simple and rather easy-to-implement measures that already achieve considerable improvement. One example is real-time SPC(2) for the operator, the quality inspector, and/or the shift lead. Statistical process control ensures that the responsible people are informed early enough to take corrective action before a negative event happens. On the other hand, SPC ensures that the “organization” does not overreact when no reaction is needed. Of course, the respective control limits are important for this, and people need to be trained in them.
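To make the idea concrete, here is a minimal sketch of such a real-time SPC check, using a basic Shewhart-style individuals chart with 3-sigma control limits. The baseline values and incoming measurements are hypothetical, and this deliberately omits pieces a production SPC system would have (moving-range chart, run rules, specification limits):

```python
import statistics

def control_limits(baseline, sigma_factor=3.0):
    """Derive center line and control limits from a known-stable baseline period."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean - sigma_factor * sd, mean, mean + sigma_factor * sd

def out_of_control(value, lcl, ucl):
    """Return True if the measurement falls outside the control limits."""
    return value < lcl or value > ucl

# Baseline from a stable production period (illustrative values).
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.05, 9.95, 10.1, 9.9, 10.0]
lcl, center, ucl = control_limits(baseline)

# Incoming live measurements: alert the operator immediately,
# instead of discovering the deviation in the next morning meeting.
for value in [10.05, 9.92, 10.6]:
    if out_of_control(value, lcl, ucl):
        print(f"ALERT: {value} outside control limits ({lcl:.2f}, {ucl:.2f})")
```

The point is that the alert fires while the process is drifting, before scrap is produced, while values within the limits trigger no reaction at all, which is exactly the overreaction-prevention mentioned above.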
SPC is, in my opinion, the simplest way to benefit from data, and a very powerful one to improve performance, decrease cost, and establish a more stable process.
Let me know your opinion in the comments section.
(1) Safety, Quality, Performance, Delivery, Cost
(2) Statistical Process Control
