While the idea of a black box is intriguing, it can miss the point. Feeding information into a black box that spits out a solution is valuable, but understanding what is going on inside that black box is vastly more valuable.
“If operators at a manufacturing plant understand how the software program analyzed the data and came to the conclusion that it did, the operator can use that as a learning process and continue to improve,” explains Berk Birand, CEO of Fero Labs.
The company, which uses explainable machine learning to help factories optimize processes, cut emissions and costs, and improve quality, is finding that getting newer operators trained quickly is a high-value use of this technology.
Explainable machine learning rests on three key principles: it is statistical, it is causal, and it can be corroborated. By expressing results as a range of statistical outcomes, the software helps operators understand the factors behind a decision and learn from the different possibilities. It should also determine the root cause of an issue and identify which variables are causally linked; on this point, Birand notes that quality issues can cost up to 40% of total operations, so root cause analysis is essential, and machine learning software can deliver a diagnosis far quicker than a human being. Finally, the ability to interact with the software matters: an operator can run through different test scenarios to see how the outcomes change.
“With explainable machine learning, an operator can have a conversation with the software. For example, if we add more carbon to the steel, what happens?” says Birand. He uses this example because his company has been working for three years with Gerdau, the largest long steel producer in Latin America. Across six of Gerdau’s plants, the software is being used to improve the quality of the steel.
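To make the three principles concrete, the sketch below is a minimal, purely illustrative example, not Fero Labs’ software or Gerdau’s data. It fits an interpretable model to synthetic steel-process numbers, reports a prediction as a statistical range, shows each input’s contribution (a simple stand-in for the causal analysis Birand describes), and answers the kind of what-if question he mentions about adding carbon. All variable names, values, and coefficients are invented for the illustration.

```python
# Illustrative sketch only: synthetic data, not Fero Labs' product or method.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)

# Synthetic process data: carbon %, manganese %, furnace temperature (C)
X = np.column_stack([
    rng.uniform(0.05, 0.50, 500),   # carbon
    rng.uniform(0.30, 1.50, 500),   # manganese
    rng.uniform(1500, 1650, 500),   # temperature
])
# Synthetic target: tensile strength (MPa) with noise
y = 300 + 900 * X[:, 0] + 80 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 15, 500)

features = ["carbon", "manganese", "temperature"]
model = BayesianRidge().fit(X, y)

# "Statistical": the prediction is a range, not a single number.
current = np.array([[0.20, 0.80, 1580.0]])
mean, std = model.predict(current, return_std=True)
print(f"predicted strength: {mean[0]:.0f} +/- {2 * std[0]:.0f} MPa")

# "Explainable": each input's weight in the prediction is visible.
for name, coef, value in zip(features, model.coef_, current[0]):
    print(f"  {name}: coefficient {coef:+.1f} (current value {value})")

# "Corroborated": the operator can ask a what-if question, e.g.
# "if we add more carbon to the steel, what happens?"
what_if = current.copy()
what_if[0, 0] += 0.05   # raise carbon by 0.05 percentage points
new_mean, new_std = model.predict(what_if, return_std=True)
print(f"with +0.05% carbon: {new_mean[0]:.0f} +/- {2 * new_std[0]:.0f} MPa")
```

In this toy version, the “conversation” is just a changed input and a re-run prediction; the point is that the operator sees both the new outcome and the uncertainty around it.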
Using machine learning, the software can adapt to changes on the factory floor at a level that would be impossible for humans. What humans can do is use the software to learn at a faster pace. “We are working with companies who are all facing the same problem,” says Birand. “The very senior, experienced people are either leaving now or will be leaving soon, and there aren’t enough new people to replace them. And when there are new people, the traditional way of training would take a number of years. But using explainable machine learning, we can address this in an elegant way and get people up to speed much quicker.”
For example, if a company has a very seasoned operator on one shift but cannot find that level of experience for a second shift, data from the experienced person’s shifts can be fed into the software for the less experienced person to use. The software learns the optimal settings for a process and uses that information to help an inexperienced operator understand it. Operators do not need a data science background to learn from the algorithms. And because the software is deployed in real time, it continuously updates and improves the process without senior people having to be called in to make adjustments.
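One way to picture that handover, again purely as an illustration and not a description of Fero Labs’ actual product, is a model fitted to the experienced shift’s historical records that is then searched for the settings predicted to give the best quality. The variables, ranges, and synthetic data below are invented for the sketch.

```python
# Hypothetical sketch: recommend setpoints from historical (settings, outcome)
# records of an experienced shift. Illustrative only, not Fero Labs' method.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

# Historical records: two controllable settings (e.g., casting speed and
# cooling rate) and the resulting quality score from the experienced shift.
speed = rng.uniform(0.8, 1.6, 400)
cooling = rng.uniform(10, 40, 400)
quality = (100 - 30 * (speed - 1.2) ** 2
           - 0.05 * (cooling - 25) ** 2
           + rng.normal(0, 1, 400))

X = np.column_stack([speed, cooling])
model = GradientBoostingRegressor().fit(X, quality)

# Search the controllable settings for the highest predicted quality.
grid_speed, grid_cooling = np.meshgrid(np.linspace(0.8, 1.6, 50),
                                       np.linspace(10, 40, 50))
candidates = np.column_stack([grid_speed.ravel(), grid_cooling.ravel()])
predicted = model.predict(candidates)
best = candidates[np.argmax(predicted)]

print(f"recommended casting speed: {best[0]:.2f}, cooling rate: {best[1]:.1f}")
print(f"predicted quality at those settings: {predicted.max():.1f}")
```

A newer operator could consult recommendations like these without touching the model itself, which is the “no data science background required” point above.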
Taking this to the next level, you can create a virtual factory floor for the less experienced person to learn from. “In fact, with today’s digital twins operators can learn directly about the systems and receive training without having to be on the actual machines,” says Birand.
Simulation is a quick training method, and one that particularly appeals to younger workers, who are adept with technology and want it to be part of their daily workflow. This type of job also fits workers who want process improvement to be part of their roles. Direct involvement with machine learning not only gives employees valuable skills but also creates a closer tie to how the work is being done.
While there is the ever-present fear that this technology will replace humans, Birand says that is far from the case. “We can build technology, but ultimately it’s not enough. Machine learning can’t reason through a risk-reward analysis or ‘trust its gut’; it’s the humans who are good at making judgments.”