Stop the Profit Drain: Pull Data Across an Entire Organization
I know a company that has 16,000 process improvement software licenses installed on the laptops of virtually every engineer who touches its products. These individuals, most of whom are trained Six Sigma Black Belts, are experts in analyzing data to improve yield and performance. The company is legendary for its manufacturing quality.
It's a tremendous commitment to process improvement, but it holds the potential to create 16,000 silos of data and knowledge. It nourishes an environment where thousands of engineers reinvent the wheel, because all the meetings in the world can't circulate best practices fast enough to prevent it.
And it puts the company at risk of missing the big picture. These engineers have learned to grow a robust tree, but no one is looking at the health of the forest. True enterprise-wide quality lifecycle management can't take root in this kind of environment. It requires an integrated approach that looks at a broad set of data. Variables that seem random in a small data set can turn out to be anything but when examined across multiple processes, equipment sets and factories. It's about looking at the whole, not just the parts.
Culturally, looking at process improvement on a global basis is a tough sell. So much process improvement has come from decentralizing and de-bureaucratizing quality initiatives. Readers are surely thinking, "By the time we got all that data together, our individual engineers could have found and solved dozens of profit-draining problems." That might have been the case a decade ago, but global process improvement is very attainable today.
The most sophisticated manufacturers are already doing this. They have created a living archive of collective knowledge where data from across the organization is warehoused together (and accessible to everyone working on process improvement). The results of individual efforts are published and scored. Models that work the best are distributed while those that don't are discarded. There is, in essence, a virtual workbench created for collaboration across the entire process. There is very little re-work. And most importantly, the individual quality project is still very much a part of the process. Global collaboration does not preclude innovative process improvement through an individual engineer's efforts.
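To make that concrete, here is a minimal Python sketch of such an archive: a shared registry where engineers publish scored models and anyone can look up the current champion for a process. Every name, process and score here is invented for illustration; a real system would sit on the shared warehouse rather than in memory.

```python
from dataclasses import dataclass

@dataclass
class ModelRecord:
    """One published result in the living archive."""
    author: str
    process: str       # hypothetical process name
    description: str
    score: float       # hold-out prediction error; lower is better

class ModelArchive:
    """Shared workbench: publish results, query the best model so far."""

    def __init__(self) -> None:
        self.records: list[ModelRecord] = []

    def publish(self, record: ModelRecord) -> None:
        """Every individual effort is kept and scored, nothing is siloed."""
        self.records.append(record)

    def champion(self, process: str) -> ModelRecord:
        """The best-scoring model is the one that gets distributed."""
        candidates = [r for r in self.records if r.process == process]
        return min(candidates, key=lambda r: r.score)

archive = ModelArchive()
archive.publish(ModelRecord("kim", "cold-roll line 3", "regression on furnace temp", 0.042))
archive.publish(ModelRecord("lee", "cold-roll line 3", "tree model on speed and temp", 0.031))
print(archive.champion("cold-roll line 3").description)  # tree model on speed and temp
```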
A Steelmaker Leads the Way
A Korean steelmaker had made sound incremental process improvements using a cadre of highly trained Six Sigma Black Belts. But it sensed it could make a larger impact with an enterprise-wide, coordinated effort built on data integration and enterprise data mining and modeling. Profitability still varied widely between plants and items, and scrap losses were unacceptably high. Traditional, isolated, process-oriented analysis wasn't sufficient.
By pulling all of its data together across plants and processes, it reduced its scrap ratio from 15% to 1.5%, saving $150,000 on one process alone. It identified variations in profitability by plant and item for cold-rolled steel, delivering an annual $1.2 million return on its investment in a software solution. Its Six Sigma experts used the solution to identify high-value targets and form teams to attack the root causes. It also cut lead times for standard hot-coil production by more than half (from 30 to 14 days) and reduced inventory by 60%. These efforts shortened the planning and sales cycles and maximized production utilization. The improvements also didn't take months to accomplish -- the analytical cycle times actually dropped.
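The "high-value targets" step lends itself to a short illustration. Once plant- and item-level data sit in one warehouse, ranking the biggest profitability gaps is a few lines of analysis. Here is a minimal sketch using pandas, with invented plants, items and figures:

```python
import pandas as pd

# Invented sample data: profit per ton recorded by plant and item over recent runs.
runs = pd.DataFrame({
    "plant": ["Plant A", "Plant B", "Plant A", "Plant B"] * 3,
    "item": ["cold-roll"] * 6 + ["hot-coil"] * 6,
    "profit_per_ton": [41, 55, 38, 57, 44, 52, 60, 61, 58, 63, 59, 62],
})

# Mean profit per ton by item and plant, then the gap between the best and
# worst plant for each item: a large gap flags a high-value target for a
# root-cause team to attack.
by_plant = runs.groupby(["item", "plant"])["profit_per_ton"].mean()
gap = by_plant.groupby("item").agg(lambda s: s.max() - s.min()).sort_values(ascending=False)
print(gap)  # cold-roll shows a far larger plant-to-plant gap than hot-coil
```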
What Does Your Company Need to Do to Get Started?
Achieving an end-to-end understanding of your process requires five capabilities -- all of which can be implemented with readily available information technology:
- An up-to-the-minute, unified view of all relevant data -- Many companies already have data warehouses storing information from multiple sources, even individual desktop computers. Oftentimes, they aren't doing anything with it. Unified data integration provides "one version of the truth" and replaces disparate spreadsheets, individual application tools and proprietary custom solutions.
- A rigorous framework for historical analysis -- Anyone can create a trend chart or a bar chart. End-to-end improvements require testing and managing models and encoding business rules and criteria for analysis.
- Tools for proactive analysis and action -- You want to catch mistakes before the batch is finished or shipped. An automated monitoring and alert mechanism scans operating data, flags potential problems by modeling against pre-established business and engineering rules, and automatically issues alerts for corrective action (a minimal sketch of such rule-based monitoring follows this list). Predictive analysis is forward-looking and far more valuable than trying to figure out what happened months and millions of dollars later.
- A living archive of collective knowledge -- You want everyone on a team to know what each person knows. A knowledge repository stores the results of historical and proactive analysis, complete with cross-functional context. This supports collaborative learning within and between teams and avoids rework.
- Knowledge available to all stakeholders -- Not just the data, but the analysis of it, must be available to everyone from line managers to executives in a form that is meaningful to them.
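To ground the second and third capabilities, encoded rules plus automated monitoring need not be elaborate. Below is a minimal Python sketch that scans a stream of readings against two pre-established rules -- an engineering spec limit and a run rule for process drift. The thresholds and readings are invented for illustration:

```python
from collections.abc import Iterable

UPPER_LIMIT = 2.5   # invented engineering spec, e.g. strip thickness in mm
RUN_LENGTH = 5      # invented run rule: N consecutive readings above center

def scan(readings: Iterable[float], center: float) -> list[str]:
    """Scan operating data and flag violations of pre-established rules."""
    alerts = []
    run = 0
    for i, x in enumerate(readings):
        # Rule 1: a single reading outside the engineering spec.
        if x > UPPER_LIMIT:
            alerts.append(f"reading {i}: {x} exceeds spec limit {UPPER_LIMIT}")
        # Rule 2: a sustained run above center suggests the process drifted.
        run = run + 1 if x > center else 0
        if run == RUN_LENGTH:
            alerts.append(f"reading {i}: {RUN_LENGTH} consecutive readings "
                          f"above center {center} -- possible drift")
    return alerts

# Catch the problem while the batch is still on the line, not months later.
for alert in scan([2.1, 2.2, 2.3, 2.3, 2.4, 2.6], center=2.0):
    print("ALERT:", alert)
```

A production system would run this continuously against the unified warehouse and route alerts to the responsible team, but the principle is the same: the rules live in code, not in any one engineer's head.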
Incremental improvement is always possible. But true company-wide improvement requires a collaborative process that pulls in data from across the organization and looks at it as a whole. It doesn't decrease the value of individual improvements -- it magnifies them, creating less waste and stronger profits.
Michael Newkirk is a Product Marketing Manager for SAS in Cary, N.C. SAS is a provider of business intelligence applications with particular expertise in data quality and analytics. http://www.sas.com/