Wednesday, February 14, 2018

Is my color process going awry?

This is the first in a series of blog posts giving some concrete examples of how the newly-invented techniques of ColorSPC and ellipsification can be used to answer real-world questions asked by real-world people about real-world problems for color manufacturers.

So, picture this scenario. I am running a machine that puts color onto (or into) a product. Maybe it's some kind of printing press; maybe it mixes pigment into plastic; maybe this is about dyeing textiles or maybe it's about painting cars. The same principles apply.

John the Math Guy really lays color SPC on the line

Today's question: I got this fancy-pants spectrophotometer that spits out color measurements of my product. How can I use it to alert me when the color is starting to wander outside of its normal operating zone?

An important distinction

There are two main reasons to measure parts coming off an assembly line:

     1. Is the product meeting customer tolerances?

     2. Is my machine behaving normally?

The first question is about conformance; the second is about SPC (statistical process control). These are intertwined. Generally, one implies the other. But consider two scenarios where the two answers are different.

It could be that the product is meeting tolerances, but the machine is a bit wonky. Not wonky enough to be spitting out red parts instead of green, but there is definitely something different than yesterday. Should we do anything about this? Maybe, maybe not. It's certainly not a reason to run out of the building with our hair on fire. But it could be your machine's way of asking for a little TLC in the form of preventative maintenance.

Or it could be that your machine is operating within its normal range, and is producing product that is outside the customer tolerances. This is the case you need to worry about. Futzing with the usual control knobs ain't gonna bring things in line. You need to change something about your process.

Use of DE for SPC

The color difference formulas, such as DE00, were designed specifically to be industrial tolerances for color. While DE00 may well be the second ugliest formula ever developed by a sentient being in this universe, it does a fair job of correlating with our own perception of whether two colors are an acceptable match. 
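To make the idea of a color difference concrete, here is a minimal sketch. The full DE00 formula is (as noted above) far too ugly to reproduce here, so this sketch uses the much simpler DE76, which is plain Euclidean distance in CIELAB; the target and sample values are made up for illustration.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB.
    (DE00 adds weighting and rotation terms to better match human
    perception; this simpler metric just illustrates the idea.)"""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

target = (52.0, 58.0, -12.0)   # hypothetical pink target (L*, a*, b*)
sample = (53.0, 56.0, -10.0)   # one hypothetical measurement
print(round(delta_e_76(target, sample), 2))
```

A single number out of a three-dimensional comparison: that compression is exactly what makes DE convenient for conformance, and (as we'll see) treacherous for SPC.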

But is it a good way to assess whether the machine is operating in a stable manner? I mean, you just track DE over time, and if it blips, you know something is going on. Right? Let's try it out on a set of real data.

The plot below is a runtime chart of just over 1,000 measurements of a pink spot color that I received from Company B. These are all measurements from a single run. I don't know for sure what the customer tolerance was, but I took a guess at 3.0 DE00, and added that as an orange dashed line.

It sure looks like a lot of measurements were out of tolerance!

Uh-oh. It looks like we got a problem. There are a whole lot of measurements that are well above that tolerance... maybe one out of three are out of tolerance?

But maybe it's not as bad as it looks. The determination lies in how one interprets tolerance. Here is one interpretation from a technical report from the Committee for Graphic Arts Technologies Standards (CGATS TR 016, Graphic technology — Printing Tolerance and Conformity Assessment):

"The printing run should be sampled randomly over the length of the run and a minimum of 20 samples collected. The metric for production variation is the 70th percentile of the distribution of the color difference between production samples and the substrate-corrected process control aims."
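The TR 016 recipe -- sample the run, take the 70th percentile of the color differences, compare against the tolerance -- can be sketched in a few lines. The DE values below are simulated, not real press data.

```python
import random

def percentile(data, p):
    """p-th percentile by linear interpolation between sorted values
    (the same idea as numpy's default percentile method)."""
    s = sorted(data)
    k = (len(s) - 1) * p / 100.0
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (s[c] - s[f]) * (k - f)

# Simulated DE00 values for sampled sheets from a production run.
random.seed(1)
de_values = [abs(random.gauss(2.0, 1.0)) for _ in range(40)]

metric = percentile(de_values, 70)     # TR 016's production variation metric
tolerance = 3.0                        # Level II conformance
print(f"70th percentile = {metric:.2f}: "
      f"{'PASS' if metric <= tolerance else 'FAIL'}")
```

Note that the verdict applies to the run as a whole, not to any individual sheet -- which is precisely why a per-measurement tolerance line on a runtime chart is the wrong lens.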

TR 016 defines a number of conformance levels. (For a description of what those values mean, check out my blog on How Big is a DE00.) It says that 3.0 DE00 is "Level II conformance", so the orange dashed line is a quite reasonable acceptance criterion for a press run. But a runtime chart is not at all useful for identifying those "Danger Will Robinson" moments. I mean, how do you decide if a single measurement is outside of a tolerance that requires 20 measurements?

If we want to do SPC, then we must set the upper control limit differently.

Use of DE for SPC, take 2

The basic approach from statistical process control -- the whole six sigma shtick -- is to set the upper control limit based on what the data tells us about the process, and not based on customer tolerances. It is traditional to use the average plus three times the standard deviation as the upper limit. For our test data set, this works out to 5.28 DE00.
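That calculation is simple enough to sketch. The DE00 readings below are simulated stand-ins for the real data set (which worked out to a UCL of 5.28 DE00).

```python
import math
import random

def upper_control_limit(values):
    """Classic SPC upper limit: mean plus three sample standard deviations."""
    mean = sum(values) / len(values)
    std = math.sqrt(sum((x - mean) ** 2 for x in values) / (len(values) - 1))
    return mean + 3 * std

# Simulated DE00 readings from a single production run.
random.seed(7)
de = [abs(random.gauss(2.4, 1.0)) for _ in range(1000)]

ucl = upper_control_limit(de)
flagged = [i for i, x in enumerate(de) if x > ucl]
print(f"UCL = {ucl:.2f} DE00, {len(flagged)} readings above it")
```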

The process looks in control now!

This new chart looks a lot more like a chart that we can use to identify goobers. In fact, I did just that with the two red arrows. Gosh darn it, everything looks pretty good.

But I think we need a bit closer look at what the upper limit DE means. The following pair of plots give us a perspective of this data in CIELAB. The plot on the left is looking down from the top at the a*b* values. The plot on the right is looking at the data points from the side with chroma on the horizontal axis and L* on the vertical.

The green dots are each of the measurements. The red diamond is the target color, and the ovoids are the upper limit tolerances of 5.28 DE00. (Note: in DE00, the tolerance regions are not truly ellipses, but are properly called ovoids. One should ovoid calling them ellipses, and also ovoid making really bad puns.)

Those are some big eggs!

The next image is a closeup of the C*L* plot, showing (with red arrows) the small set of wonky points that were identified with the DE runtime chart. I would say that these are pretty likely to be outliers. But look at the smattering of points that are well outside the cluster of data points, but are still within the ovoid that serves as the upper limit for DE. These should have stuck out in the runtime chart, if it were doing its job, but are deemed OK.


Now, listen carefully... If you are using a runtime plot of "DE00 from the target color", you are in effect saying that everything within the ovoids represents normal behavior for your process. So long as measurements are within those ovoids, you will conclude that nothing has changed in your process. That's just silly talk!

Here is my summary of DE runtime charts: JUST SAY NO! Well... unless you are looking at conformance, and your customer tolerance is an absolute, as in, "don't you never go above 4 DE00!"

Use of Zc for SPC

I know this was a long time ago, but remember the Z statistic from Stats 101? You compute the average and standard deviation of your data, and then normalize your data points to give you a parameter called Z. If a data point had a Z value below -3 or above +3, then it was suspicious. This is mathematically equivalent to what's going on with the upper limit in a runtime chart.
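As a refresher, here is the Z statistic in a few lines. The data is made up, with one suspicious point tacked on the end.

```python
import math

def z_scores(values):
    """Normalize each value: z = (x - mean) / sample standard deviation."""
    mean = sum(values) / len(values)
    std = math.sqrt(sum((x - mean) ** 2 for x in values) / (len(values) - 1))
    return [(x - mean) / std for x in values]

data = [5.1, 4.9, 5.0, 5.2, 4.8, 9.0]   # last point is the goober
for x, z in zip(data, z_scores(data)):
    flag = "  <-- suspicious" if abs(z) > 2 else ""
    print(f"{x:4.1f}  z = {z:+.2f}{flag}")
```

(With only six points, even a wild outlier can't push Z past about 2, so the flag threshold here is 2 rather than 3; with a thousand points, 3 is the traditional cutoff.)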

I have extended this idea to three-dimensional data (such as color data). I call the statistic Zc. This is the keystone of ColorSPC.

Now, remember back when I showed the CIELAB plots of the data along with a DE00 ovoid? Didn't you just want to grab a red pencil and draw in some ellipses that represented the data better? That's what I did, only I used my slide rule instead of a pencil. There is a mathematical algorithm that I call ellipsification that adjusts the axes lengths and orientation of a three-dimensional ellipsoid to "fit" the data. Ellipsification is the keystone of ColorSPC.

Ellipsification charts in CIELAB

The concentric ellipses in the drawings above are the points where Zc = 1, 2, 3, and 4. That is to say, all points on the innermost ellipse have Zc of 1. All points between the innermost and the next ellipse have Zc between 1 and 2.

Zc is a much better way to do SPC on color data. Here is a runtime plot of Zc for this production run. The red dashed line is set to 3.75. That number is the 3D equivalent of the Z = 3 upper limit used in traditional SPC.
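The flavor of the Zc computation can be sketched with the textbook Mahalanobis distance: fit an ellipsoid to the L*a*b* cloud using the mean and covariance of the data, then measure each point in units of "ellipsoid radii" rather than in DE. (Ellipsification proper adds some refinements beyond this; the data below is simulated.)

```python
import numpy as np

def zc_values(lab):
    """Mahalanobis distance of each row from the data's own mean,
    using the data's own covariance -- each point measured in
    'ellipsoid radii' rather than in DE."""
    lab = np.asarray(lab, dtype=float)
    mu = lab.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(lab, rowvar=False))
    diff = lab - mu
    # einsum computes diff[i] @ inv_cov @ diff[i] for every row i
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, inv_cov, diff))

# Simulated run: correlated variation around a pink color.
rng = np.random.default_rng(0)
lab = rng.multivariate_normal(
    mean=[52.0, 58.0, -12.0],
    cov=[[0.8, 0.3, 0.1],
         [0.3, 1.2, -0.4],
         [0.1, -0.4, 0.9]],
    size=1000)

zc = zc_values(lab)
print(f"{np.count_nonzero(zc > 3.75)} points with Zc > 3.75")
```

Because the covariance matrix captures both the spread and the tilt of the cloud, a point that is far out along the skinny axis of the ellipsoid gets a large Zc even if its DE from target is modest.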

Finally, a runtime chart we can believe!

As can easily be seen (if you click on the image, and then get out a magnifying glass) this view of the data provides us with a much better indication of data points which are outside of the typical variation of the process. Nine outliers are identified, and many of them stick out like sore thumbs. Kinda what we would expect from the CIELAB plots.

But wait!

In the previous DE analysis, we computed DE from the target value. In a paper by Brian Gamm (The Analysis Of Inline Color Measurements For Package And Labels Printing Using Statistical Process Monitoring Techniques, TAGA 2017), he pointed out this problem with DE runtime charts, and advocated the use of DE, but measured from the average L*a*b* value rather than from the target. The graphs below show the result of this analysis on our favorite data set.

DE00 ovoids based on computing color difference from average

Addendum Feb 22, 2018: 

I would like to update the previous paragraph based on conversations with Brian.

First, he wanted to reiterate something that I have said before, and which bears re-reiterating. Looking at a runtime chart of DE is the correct thing to do when you are doing QA -- if your question is "did my product meet the conformance criteria from my customer?" But his paper (and this blog post) show that DE is not the proper tool for finding aberrant data. Both are necessary and useful.

Second, he advocated something a bit different than what I said. Subtle, but important difference. I said "... but with DE measured from the average L*a*b* value". Brian advocated "... but with DE measured from the initial L*a*b* value". Brian is looking at the drift during a production run. The assumption is made that color was dialed in pretty decent at the start, but may be gradually changing over time.
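Brian's drift-from-initial idea can be sketched like this, using simulated data with a slow drift in b* (and with DE76, plain Euclidean distance in CIELAB, standing in for DE00 to keep it short).

```python
import math

def de76(lab1, lab2):
    """Euclidean CIELAB difference, standing in for DE00 for simplicity."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical run that was dialed in at the start, then drifts in b*.
run = [(52.0, 58.0, -12.0 + 0.01 * i) for i in range(200)]

initial = run[0]   # Brian's suggestion: measure drift from the initial color
drift = [de76(initial, lab) for lab in run]
print(f"DE from the initial color grows from {drift[0]:.2f} to {drift[-1]:.2f}")
```

A runtime chart of these values starts at zero and climbs steadily, which is exactly the signature of gradual drift that a DE-from-target chart can bury.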

Thanks, Brian!

It is interesting to note that the DE00 ovoid in a*b* (on the left) is similar to the ovoid produced by ellipsification. Larger, and not quite as eccentric, but similar in orientation. This is a good thing, and will often be the case. This will not be the case for any pigments that have a hook, which is to say, those that change in hue as strength is changed. This includes cyan and magenta printing inks.

However, it can be seen that the DE00 ovoid in C*L* (on the right) does not align with the orientation of the data. This is soooo typical of C*L* ovoids!

So, DE00 from the average is a much better metric than DE00 from the target color. If you have nothing else to use, this is preferred. And if you are reading this shortly after this blog was posted, and you aren't using my computer, then you have nothing else to use, since these wonderful algorithms have not migrated beyond my computer as I write this. I hope to change that soon.


For the purpose of conformance testing, there is no question that DE is the choice. DE00 is preferred to DEab (or even DECMC, DE94, or DIN 99).

For the purpose of SPC -- characterizing your color process to find outliers -- the DE from target metric is lousy. DE from average is preferable, but the best metric is Zc, which is based on ColorSPC and fitting ellipses to your data.
