
Davis Balestracci


‘Control Charts! They Make the Blind Hear and the Deaf See!’

Once again, I find myself appreciating Deming’s hatred of statistical hacks

Published: Tuesday, July 14, 2020 - 12:03

“With data from an epidemic there is no question of whether a change has occurred. Change is everywhere. The question is whether we are getting better or worse. So while the process behavior chart may be the Swiss army knife of statistical techniques, there are times when we need to leave the knife in our pocket, plot the data, and then listen to them as they tell their story.”
Dr. Donald J. Wheeler

I agree with Dr. Wheeler's comment. Yet I'm seeing far too many process behavior charts used in naïve attempts to interpret the mountains of questionable Covid-19 data being produced. I've done a few charts myself out of curiosity, but none that I feel are worth sharing. Dr. Wheeler's two recent, excellent Quality Digest articles have been the sanest things written, with nary a control chart in sight.

However, with control charts getting increased visibility, I think it’s far past time for a review of some basic concepts to stem the rising tide of what Dr. Wheeler has called “superstitious nonsense.”

Many people have had to suffer through control chart seminars (some even teaching all seven control charts), followed by a torturous discussion of "special cause tests," usually the famous eight Western Electric rules. People are then left even more confused: Does each signal need to be investigated individually, i.e., treated as a special cause?

Not to worry: Most people automatically default to investigating only the points outside the control limits (even well-respected authors who should know better fall into this trap). People can't seem to get away from an obsessive focus on individual observations.

But what if there is one underlying explanation generating many of these signals that has nothing to do with individual outliers?

And then there is the all-too-casual use of the ubiquitous word “trend.”

A pair of analysts once smugly presented me with the graph shown below. (Yes, the y-scale started at 0.) It almost convinces you that there is a trend, eh?

I can almost picture Six Sigma Black Belt No. 1 scolding them: “Now, now, now. Test the data for normality, and if it passes, you need to plot that as a control chart.” Note: It does indeed pass (p-value = 0.507), but the test is totally inappropriate and irrelevant. Despite the widespread belief, normality is not a requirement for control charts!

Using standard control chart software, a typical individuals chart for these data is shown below:

Most packages conveniently have the famous eight Western Electric tests programmed in as well (shown as triggered at the top of the graph): Sixteen of the 52 data points generate 30 signals.
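For readers who want to see the mechanics, here is a minimal sketch of how an individuals (XmR) chart computes its natural process limits (mean ± 2.66 × average moving range) and flags the classic Rule 1 signal. The data below are made up for illustration, not the article's 52 weekly values:

```python
# Sketch: individuals (XmR) chart limits and the Rule 1 signal.
# Data are illustrative, not the article's series.

def xmr_limits(data):
    """Natural process limits for an individuals chart: mean +/- 2.66 * MRbar."""
    mean = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def rule1_signals(data):
    """Indices of points outside the three-sigma natural process limits."""
    lcl, ucl = xmr_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

data = [78, 80, 79, 81, 77, 80, 79, 78, 95, 80, 79, 81]
print(rule1_signals(data))  # flags the obvious outlier at index 8
```

The other seven Western Electric tests layer additional run-based patterns on top of the same limits; Rule 1 is shown here because it is the one most people default to.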

In line with the pious title of this article, a “laying on of hands” then takes place:

Given the number of special cause tests, this chart by itself makes no sense and should not be presented for general consumption.

It is only now that the real work begins, i.e., critical thinking: A good analyst would use the chart to ask the next set of questions for making sense of the situation. But where should one start? Many people would immediately want to investigate the four points outside the three standard deviation limits (observations No. 9 and Nos. 50–52).

Then Black Belt No. 2 says, “It’s obvious the control chart needs to be adjusted for the trend.” There’s plenty of customer-friendly software that will do just that:

That was obviously the solution: We’re down from 30 special cause signals to seven. Better still, there are no data points outside the limits. Now what? Perhaps investigate each signal, but again, where do you start?

The computer will do anything you want.

Regression was my favorite course in grad school, and I’m very good at it. But I rarely use it. During my 40 years as a statistical practitioner, I have never ever seen an appropriate use of a trend line on data from a service industry (e.g., healthcare) plotted over time. Never. My very distinguished colleague, the late Tom Nolan, agreed. Yet it remains the display that won’t die.

Keeping it simple

Over the years, I have developed an increasing affection for the much-neglected run chart: a time plot of your process data with the median drawn in as a reference (yes, the median, not the average). It serves as filter No. 1 for any process data and answers the question, "Did this process have at least one shift during this time period?" This is generally signaled by a clump of eight consecutive data points all above or all below the median.
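The run-of-eight rule can be sketched in a few lines; this assumes the common convention of skipping points that fall exactly on the median:

```python
import statistics

def shift_signal(data, run_length=8):
    """True if a run chart shows `run_length` consecutive points all above
    or all below the median (points on the median are skipped)."""
    med = statistics.median(data)
    run, side = 0, 0
    for x in data:
        if x == med:
            continue  # a point on the median neither breaks nor extends a run
        s = 1 if x > med else -1
        run = run + 1 if s == side else 1
        side = s
        if run >= run_length:
            return True
    return False

shifted = [5, 6, 5, 7, 6, 5, 6, 7, 5, 6, 9, 10, 9, 11, 10, 9, 10, 11, 9, 10]
print(shift_signal(shifted))  # the step up mid-series produces a long run
```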

If it did, then it makes no sense to do a control chart at this time on all the data because the overall average of all these data doesn’t exist. (Sort of like: If I put my right foot in a bucket of boiling water and my left foot in a bucket of ice water, on average, I’m pretty comfortable.)

Run charts are generally taught—if they’re taught at all—as a boring prelude to what is obviously the more favored sophisticated, important, and powerful analysis a control chart allegedly offers. Many computer packages don’t even offer them as an option. It is the poor stepchild, ignored because it does not find individual special-cause observations. But that is not its purpose.

What does the run chart of these data below tell us?

With the y-axis scale a lot healthier and no control limits as a distraction, doesn’t it look like the process “needle” shifted twice—around Aug. 17 (observation No. 21) and Feb. 17 (observation No. 47)? In fact, when I asked the clients about those two dates, they looked at me (smugness gone) like I was a magician and asked, “How did you know?” Those dates coincided with two major interventions to improve this process. As the run chart demonstrates, they worked: two distinct needle bumps (step-change special cause) but hardly a continuously increasing improvement trend.

In other words, a process goes from what it is “perfectly designed” to get with its original inputs and transitions to what it is “perfectly designed” to get based on the new inputs. It eventually settles into the new average based on these inputs. This also puts the trend special-cause test (six successive increases or decreases) into perspective: A step change can manifest as a trend during the transition. It won’t continue.
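That trend test (six successive increases or decreases) is equally simple to sketch, and a step change will often trip it during the transition, exactly as described above:

```python
def trend_signal(data, increases=6):
    """True if the series contains six successive increases (or decreases),
    the trend special-cause test discussed in the text."""
    up = down = 0
    for a, b in zip(data, data[1:]):
        if b > a:
            up, down = up + 1, 0
        elif b < a:
            down, up = down + 1, 0
        else:
            up = down = 0  # a tie resets both runs
        if up >= increases or down >= increases:
            return True
    return False
```

A series transitioning through a step change, such as `[80, 80, 81, 83, 85, 87, 89, 91, 91, 90, 91]`, trips the test during the climb even though no sustained trend exists once the new level is reached.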

The control chart should be filter No. 2—plotting the data after any shifts have been determined, which then usually reduces the number of special cause signals and results in a lot less confusion.

A good chart software package should: 1) allow you to easily toggle between run and control charts of your data; and 2) easily allow you to examine isolated “time segments” to theorize where special causes (usually step changes) occurred.
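Point 2 can be sketched as a small helper (hypothetical, not any particular package's API) that recomputes individuals-chart limits within each segment between known shift points:

```python
def segmented_limits(data, breaks):
    """Individuals-chart limits (mean +/- 2.66 * MRbar) recomputed for each
    segment between known shift points. `breaks` are hypothetical indices
    where step changes have been identified (e.g., from a run chart)."""
    edges = [0] + sorted(breaks) + [len(data)]
    limits = []
    for lo, hi in zip(edges, edges[1:]):
        seg = data[lo:hi]
        mean = sum(seg) / len(seg)
        mrs = [abs(b - a) for a, b in zip(seg, seg[1:])]
        mr_bar = sum(mrs) / len(mrs)
        limits.append((mean - 2.66 * mr_bar, mean + 2.66 * mr_bar))
    return limits
```

Applied to a series with a step change at index 4, the two segments get their own, much tighter limits, which is why the number of spurious signals drops once the shifts are modeled.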

So, the correct resulting control chart is shown below:

With not a special cause to be found, other than the programmed step changes.


The clients’ original performance was 78.5 percent. Their first intervention (special cause No. 1) improved the process to 83.8 percent, and their second intervention (special cause No. 2) improved that further to 91 percent.

They had recently started yet another intervention, and based on the last four data points, it’s looking relatively promising. What would indicate success?
1. There are three immediate increases. Two or three more would be good evidence (trend transition).
2. Maybe a weekly performance will go outside the upper limit (97.8%).
3. Maybe the next four to six weeks will all be above the average (91%).
4. Two out of three consecutive weeks’ performances will be between 95.5 percent (two standard deviations above the average) and 97.8 percent (the three-standard-deviation upper limit). This is a very useful test known as the two-out-of-three rule.
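The two-out-of-three rule in point 4 can be sketched as follows. The sigma here is inferred from the article's own figures (average 91, with 95.5 at two standard deviations, implying sigma of roughly 2.25); the weekly values are hypothetical:

```python
def two_of_three_signal(values, mean, sigma):
    """True if two of any three consecutive points fall beyond mean + 2*sigma
    (the upper-side two-out-of-three rule described in the text)."""
    beyond = [v > mean + 2 * sigma for v in values]
    return any(sum(beyond[i:i + 3]) >= 2 for i in range(len(beyond) - 2))

# Hypothetical weekly percentages against the article's average of 91:
print(two_of_three_signal([92, 96, 94, 96], 91, 2.25))  # two of three beyond 95.5
```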

And, not to overreact, if performance goes down from one week to the next, be advised that it could differ as much as 8.4 percent from the previous week simply due to common cause (upper limit of the moving range chart). What could be simpler—if taught correctly?
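That 8.4 percent figure is the upper range limit of the moving range chart: 3.268 (the D4 constant for subgroups of two) times the average moving range. A minimal sketch, with made-up data:

```python
def mr_upper_limit(data):
    """Upper range limit of the moving range chart: 3.268 * average moving
    range (3.268 is the D4 constant for subgroups of size two). Week-to-week
    differences below this limit are indistinguishable from common cause."""
    mrs = [abs(b - a) for a, b in zip(data, data[1:])]
    return 3.268 * sum(mrs) / len(mrs)

weeks = [90, 92, 89, 91, 93, 90]  # hypothetical weekly percentages
print(round(mr_upper_limit(weeks), 2))
```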

Yet how many hours are you spending in meetings looking at “trends”?

But watch out before you declare: “Control charts! They make the mute walk and the lame talk!”

There’s this little thing called competence....


About The Author


Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.


Control Charts

The first step is to know something about the data you desire to analyze, i.e., was it generated by the same process? I too have seen many of the missteps fostered by software; it will do what you ask of it even if the ask is nonsensical. The humble time series chart and a histogram are a great way to begin the exploration.

Excellent and very intelligent

As a teacher of SPC, I found your article very useful.

I found your ideas very clear and easy to understand

Many thanks

Control Charts put the C in DMAIC

Far too many improvement projects fail to put the Control phase into operation, resulting in the loss of the improvement.

We can haggle about run charts vs control charts, but if improvers are doing neither, it's irrelevant.

Line and bar charts of performance are the dumb and dumber of improvement charts. If you aren't willing to use control charts to monitor performance after improvement, don't bother with improvement.

Control chart software is very affordable. Just buy some. Start using the tools of quality. They might surprise you.

Three Monkeys

The use of "Percent Conformance to Goal" takes the data far out of context. What exactly is being measured? Percent of what? Are the percentages calculated based on rational subgroups?

Shouldn't a run chart be based on actual individual measurements?