In the brave new world of the Industrial Internet of Things (IIoT), many companies are discovering that they have better connectivity and access to a wide variety of data generated by their business and manufacturing computing systems. This digital information is a product of the databases used to order products, manufacture them, and even ship them. With all this information at hand, manufacturers are beginning to ask how the data could be mined to improve a whole host of company metrics: lead times, yields, supply chain efficiency, and so on. For most businesses that use computers across their product delivery stream, the key is to separate the valuable information from the massive amount of less valuable data at hand. As you might imagine, this becomes a daunting challenge.
To illustrate this issue, let me offer the following fictional case study, based on an actual manufacturing model. Consider a company that extrudes film for the packaging industry. I chose this example because packaging films are typically produced using roll-to-roll manufacturing methods, a platform in which we at Optimation have some expertise. Let's focus on the extrusion process that produces the packaging film, and on the data that a typical production system for the film might generate.
At a high level, our film producer will buy several types of plastic resin from suppliers, along with other ingredients added for functional reasons (mechanical properties, coloring, lubricity, etc.). The incoming resin will be received with supplier batch information, stored, then dried, blended, mixed, and delivered to the line to be extruded into film. During extrusion, process conditions like extruder RPM, temperatures, and pressures will be controlled to produce a hot liquid film of the right thickness and performance characteristics. The liquid sheet of film from the extrusion die is pumped onto a chill roller, which cools and solidifies the resin into the product web. The chilling step would be equipped with process controls for chill water pumps, pressure and flow control valves, and heat exchangers. On-line instrumentation like thickness and moisture gauges might feed back to a control loop at the extruder, or be used to track product quality. If the film is to be oriented, an in-line stretching operation will be utilized. The plastic web in final form is then wound into a roll, creating a unit load of product suitable for post-processing (converting). The conveying and winding of the product roll requires control of web tension, steering, winding shaft torque, and more.
This complex process uses (downloads, stores, and measures) data in a variety of categories. Four major groupings are: recipe information, machine setup conditions, machine run parameters, and product quality measures. Throughout the production equipment, the control systems (which can operate at the machine/shop-floor level and at a supervisory level above it) use values input by other interacting systems or by the operator to establish set points. The instrumentation (sensors and measurement devices) in the machine control loops generates signals that are input to the controls and plays a major role in maintaining the run conditions, or states, of the equipment. The control system maintains the process around the run parameters (which have upper and lower control limits) that are established with the expectation that good product will be produced.
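As a thought experiment, the four data groupings for a single production roll could be organized roughly like this. This is a minimal sketch only; every field name here is hypothetical, not drawn from any actual control system.

```python
from dataclasses import dataclass, field

@dataclass
class RollRecord:
    """Hypothetical record of everything captured for one wound roll."""
    roll_id: str
    # Recipe information downloaded to the line (e.g. resin blend ratios).
    recipe: dict[str, float] = field(default_factory=dict)
    # Machine setup conditions: the set points established before the run.
    setup: dict[str, float] = field(default_factory=dict)
    # Machine run parameters: a time series of readings per process channel.
    run_data: dict[str, list[float]] = field(default_factory=dict)
    # Product quality measures, e.g. from off-line QC after the fact.
    quality: dict[str, float] = field(default_factory=dict)

roll = RollRecord(roll_id="R-0001")
roll.recipe["resin_a_pct"] = 92.5
roll.setup["extruder_rpm_setpoint"] = 85.0
roll.run_data["die_pressure_psi"] = [2110.0, 2114.0, 2109.0]
roll.quality["thickness_um"] = 25.1
```

Keeping the four groupings separate like this makes it easy to later line up run-data deviations against quality outcomes, roll by roll.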
To help put this in perspective, consider one wound roll of finished packaging film at the end of the production line. Let's assume that the roll is 20,000 feet long and took 20 minutes to produce at a run rate of 1,000 feet per minute (FPM). Our model production line, from recipe download and resin input through the extrusion and chilling steps, then stretching, moisture and thickness measurement, and finally winding, may have one hundred or more process data channels handling information during the production of the roll. Each of these channels has a scan rate: the frequency at which a data value is measured and recorded. If we assume we are using programmable logic controllers (PLCs) at a modest scan rate of 200 milliseconds, each channel will generate 6,000 readings during the production of our roll, or 600,000 data points capturing what happened during the manufacture of the roll.
Now let's consider that we are outputting 3 rolls an hour, around the clock, so in our 5-day, 3-shift operation we manufacture 360 rolls per week. To make the analysis more realistic, let's also assume that our quality assurance step is done off-line after the fact, and that at the end of the week our QC manager reports that 30 rolls have failed a specific quality check and are not salable. Of course, we will want to know what happened! This raises the question: what do the production records of 600,000 data points per roll tell us about potential root causes? How can we mine this information in a way that helps us adjust our process and remediate the conditions that produced the rejected product?
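A quick back-of-the-envelope check of the numbers above (all figures come straight from the example in the text) shows just how fast the data piles up:

```python
# Per-roll data volume.
roll_length_ft = 20_000
line_speed_fpm = 1_000
run_minutes = roll_length_ft / line_speed_fpm              # 20 minutes per roll

scan_ms = 200                                              # PLC scan rate
readings_per_channel = int(run_minutes * 60 * 1000 / scan_ms)
channels = 100
points_per_roll = readings_per_channel * channels          # 600,000 per roll

# Weekly totals for the 5-day, 3-shift operation.
rolls_per_week = 3 * 24 * 5                                # 3 rolls/hr, 24 hr/day, 5 days
points_per_week = points_per_roll * rolls_per_week

print(readings_per_channel)   # 6000 readings per channel per roll
print(points_per_roll)        # 600000 data points per roll
print(rolls_per_week)         # 360 rolls per week
print(points_per_week)        # 216000000 data points per week
```

Over 200 million data points a week, before anyone has asked a single question of the data, is exactly why a manual search for root causes is impractical.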
At Optimation, in collaboration with Eastman Kodak, we have access to a tool called Process Monitor that can be applied to this daunting task of analyzing the massive amount of raw data in search of connections between measured defects and anomalies in the process run data. Process Monitor performs regression analysis across a statistically meaningful number of events (roll data sets), seeking to display statistical correlations between the defects of the variant product rolls and the production process data. While this analysis may not isolate the actual root cause, it will relate statistically meaningful deviations in the data streams of one or more channels, across the sample population, to the measured defect. This mathematical link becomes the pointer used to troubleshoot the suspect process area, allowing an effective and focused improvement to be implemented.
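To give a feel for the kind of statistical pass involved, here is a deliberately simplified sketch, not the Process Monitor implementation itself. It correlates one per-roll summary statistic (a hypothetical mean die pressure) with a pass/fail defect flag across a sample of rolls, using synthetic data; the real tool works across hundreds of channels and full time series.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(1)
# Synthetic sample: rolls whose mean die pressure ran high tend to fail QC.
mean_pressures = [2100 + random.gauss(0, 10) for _ in range(60)]
defect_flags = [1.0 if p > 2108 else 0.0 for p in mean_pressures]

r = pearson(mean_pressures, defect_flags)
print(f"correlation between mean pressure and defect flag: r = {r:.2f}")
```

A strongly positive r here would flag the die-pressure channel as a candidate for focused troubleshooting, even though the correlation alone does not prove that pressure excursions caused the defect.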
The power of the Process Monitor program lies in its ability to find trends that would be virtually impossible for a process engineer to discern unaided. The tool allows the user to sift out data suitable for driving process and business gains, much like mined ore must be refined to ultimately deliver value back to the investor.