January 13, 2022
I’m a chemical engineer. The fundamentals of the chemical engineering profession were laid down 150 years ago by Osborne Reynolds. Although chemical engineering has seen many advances, such as digital process control and evolutionary process optimization, every engineer understands and uses Reynolds’ work. Most people have heard of the Reynolds number, which plays a key role in calculating air and liquid flows. There are no fads. Engineers use the fundamentals of the profession.
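To illustrate how stable those fundamentals are, here is a minimal sketch of the textbook calculation, Re = ρvD/μ (the pipe size, velocity, and fluid properties below are illustrative assumptions, not figures from this article):

```python
# Minimal sketch: Reynolds number Re = rho * v * D / mu
# Values below are illustrative assumptions, not from the article.
rho = 998.0      # density of water, kg/m^3
mu = 1.0e-3      # dynamic viscosity of water, Pa.s
v = 2.0          # mean flow velocity, m/s
D = 0.05         # pipe internal diameter, m

Re = rho * v * D / mu
print(f"Reynolds number: {Re:,.0f}")   # ~100,000 -> well into turbulent flow
regime = "turbulent" if Re > 4000 else "laminar" if Re < 2300 else "transitional"
print(f"Flow regime: {regime}")
```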
By contrast, in the past 70 years, “quality” has seen more than 20 fads. The fundamentals have been forgotten and corrupted. Quality has been lost. Quality managers engage in an endless pursuit of magic pudding that will fix all their problems.
Alarmingly, the latest “quality” fad, Agile, has nothing to do with quality. It’s a software development fad that evolved from James Martin’s rapid application development (RAD) fad of the 1980s. This in turn grew into the rapid iterative processing (RIP) fad. When it comes to quality today, anything will do, no matter how unrelated.
Before Agile we had Quality 4.0 and 5.0. Again, these fads had nothing to do with quality. They were a mishmash of whatever might catch the eye of a quality manager. Artificial intelligence... how cool... now who wouldn’t want some of that? The internet of things... trendy, but it will do nothing for reducing variation. 3D printing... again, trendy, but it has nothing to do with quality.
The most ubiquitous and destructive fad of all has been Six Sigma. It was created by Mikel Harry, who said, “Some claim that Six Sigma is no more than smoke and mirrors—a sales job for senior management and a snow job for the rest. Well, they couldn't be more on target.” He then tried to turn that statement on its head as a means to justify Six Sigma... but in truth we should have taken those words at face value. Even though Ford Motor Co., for example, was one of the first to buy into the nonsense, an eight-year survey of thousands of Six Sigma projects showed a disastrous average of 220,000 defects per million (dpm) after improvement for “successful” projects. Yes, that’s roughly one part in five that is junk.
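A quick back-of-the-envelope check of that arithmetic (a sketch only; the 220,000 dpm figure is the only number taken from the survey cited above):

```python
# 220,000 defects per million (dpm) expressed as a fraction of parts
dpm = 220_000
fraction_defective = dpm / 1_000_000
print(f"{fraction_defective:.0%} of parts defective")          # 22%
print(f"Roughly 1 in {round(1 / fraction_defective)} parts")   # roughly 1 in 5
```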
Mass production dates back 2,200 years to China, but it wasn’t until the Industrial Revolution at the start of the 19th century that it became commonplace. Mass production brought with it the need for identical and interchangeable parts, along with control of manufacturing processes. For example, in the 1860s during the American Civil War, interchangeability was key to using Minié balls in both the U.S. Springfield and British Enfield rifles. The need for interchangeability of parts made good quality essential.
In 1870 the concept of the defect fully emerged with the development of the go/no-go gauge. It was a step of great importance, but it offered nothing for process improvement. An item was either good, or it was trashed. There was nothing in between. Variation was simply on or off. Quality was measured by counting defects.
The Osborne Reynolds of quality was W. Edwards Deming. He drew greatly on those who came before, particularly Walter Shewhart and Clarence Lewis. Deming created the modern definition of quality: “On target with minimum variance.” It was a massive advance on counting defects.
Understanding Deming is key to building, maintaining, and predicting good quality now and into the future. He laid the basis for quality as a profession with his System of Profound Knowledge, his famous 14 points, seven deadly diseases, plan-do-study-act, and operational definition methodologies.
Deming emphasized the importance of being able to predict the behavior of a process into the future. His key tool was the Shewhart chart.
Although statistical methods have been available for more than a century, they were poorly suited to processes. In 1924 Shewhart introduced his control chart. Shewhart talked much of economics. His control chart was an economic chart, not a probability chart. “This state of control appears to be, in general, a kind of limit to which we may expect to go economically in finding and removing causes of variability,” he said. Shewhart defined his control limits as “economic limits.”
Shewhart added a key point in 1931: “In developing a control criterion, we should make the most efficient use of the order of occurrences as a clue to the presence of assignable causes.” This is not provided by classical statistics. The control chart is unique in its use of the element of time. Exactly the same data in a different time sequence give totally different results.
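To see why time order matters, here is a small sketch (with made-up numbers) showing that the same values, taken in a different sequence, give different moving ranges and therefore different natural process limits, computed here the way an individuals chart would (a chart discussed later in this article):

```python
# Sketch with made-up data: the same values in two different time orders.
data_a = [5, 6, 5, 7, 6, 5, 12, 13, 12, 14, 13, 12]   # a shift occurs halfway through
data_b = [5, 12, 6, 13, 5, 12, 7, 14, 6, 13, 5, 12]   # same values, interleaved

def natural_limits(values):
    """Natural process limits from an individuals (XmR-style) calculation."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

print(natural_limits(data_a))   # ~ (4.6, 13.8): limits reflect point-to-point variation
print(natural_limits(data_b))   # ~ (-9.5, 27.8): same values, shuffled, give far wider limits
```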
Even more important, in 1944 he said: “Statistical control is not mere application of statistics.... Classical statistics start with the assumption that a statistical universe exists, whereas SPC starts with the assumption that a statistical universe does not exist.” That is, Shewhart’s control charts do not depend on the nature of underlying data distributions. They do not assume or require normality.
Sadly, most people at that time, and still today, don’t understand this. Hopefully that will change in the coming years.
The Shewhart chart was a radical step forward. It rattled the statistical establishment. It created a fork in the road. Shewhart forged the new road of quality while others continued down the path of traditional statistics.
Karl Pearson called Shewhart “illogical,” while others, such as William Golomski, described him as a “hero.” Kaoru Ishikawa was “greatly impressed with the depth of his philosophy.” Ellis Ott and Deming recognized Shewhart’s brilliance and continued to build on his work. Later, David Chambers and Donald Wheeler further extended Shewhart’s work.
One of the greatest contributions to quality came from Wheeler. He tested 1,143 different data distributions, demonstrating Shewhart’s assertion that normality is not required for process behavior charts. He pointed out that we can never know the data distribution for a changing process... and that we don’t need to know!
Many prominent figures at the time, such as Joseph Juran, were dismissive and failed to understand Shewhart. Juran stated that Shewhart’s concepts were “beyond the grasp of the unsophisticated user.” Even by the 1980s, Juran still didn’t understand and was still referring to control charts as a “test of statistical significance.” He continued to produce charts of defects more appropriate to a century before.
Today, most quality practitioners fail to understand Shewhart and the process behavior chart. Popular authors, names you would even see on the pages of Quality Digest, all show a lack of understanding. Most of them don’t even discuss this central aspect of quality. Harry’s Motorola workmate Keki Bhote understood even less and in 1991 referred to control charting as “a total waste of time.”
Hundreds of thousands of practitioners and clients have read these authors’ material and have been misled. It was inevitable that quality suffered. This key tool, at the heart of quality, has been corrupted.
Future progress in quality depends on people understanding these basics. Instead, fads have proliferated, using up to 140 tools as course stocking-filler fluff.
Nothing has changed from when Ishikawa pointed out that all that anyone needs is the wise use of the seven basic tools of quality. More tools don’t give better quality.
Keeping it simple greatly benefits clients. It makes it easy for all employees to be involved in quality, as Deming advocated. Frontline workers are the real process experts. Involving them in quality isn’t difficult, if done wisely.
Like many others, the popular author Douglas Montgomery claims that process behavior charts are probability charts. He confuses specification and control limits. He throws in the nonsense term “three-sigma performance.” There’s no such thing. The performance of a process is determined by its stability, that is, whether assignable causes are present.
Most people are confused about what quality means. Hopefully, better education in the coming decades will correct this. Defects relate to the specification limits, not the control limits. Specification limits can be set anywhere, to produce any level of defects. Control limits describe the behavior of the process. A process that isn’t in control has one or more assignable causes acting and may produce any number of defects. Process behavior charts warn workers when to investigate for such assignable causes. When an assignable cause is corrected, the correction affects the entire process; it’s not merely an ephemeral event.
Montgomery’s failure to understand the nature of process behavior charts leads him to claim that for 100 parts, “...about 23.7 percent of the products produced under three-sigma quality will be defective.” He fails to understand that the process behavior chart is the voice of the process. An in-control process is predictable. However, an out-of-control process makes no prediction whatsoever about the number of defects that might be produced. The 23.7 percent figure is ridiculous. Tokai Rika, for example, produces 500,000 parts per month, fully in control.
We might apply Montgomery’s probability approach to an automobile, with a typical 30,000 parts. Suppose each of these parts were built to his claim of “excellence,” at 3.4 dpmo. This would give a 1 – 0.9999966^30,000 chance of having a defect. In other words, every automobile manufactured would have a 9.7-percent chance of being a lemon. That is, 9.7 percent of all cars would contain from one to many thousands of defects. Clearly, this approach to defects and probabilities gives silly results.
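Both of the figures above can be reproduced directly from the probability model being criticized; this sketch simply reruns that arithmetic and is not an endorsement of the model:

```python
# Reproducing the probability arithmetic under criticism (not an endorsement of it).
# "Three-sigma quality": assume each part has a 0.27% chance of being defective.
p_defect_3sigma = 0.0027
p_any_defect_100 = 1 - (1 - p_defect_3sigma) ** 100
print(f"{p_any_defect_100:.1%}")   # ~23.7% -> Montgomery's figure for 100 parts

# "Six Sigma excellence": assume each of 30,000 automobile parts is at 3.4 dpmo.
p_defect_6sigma = 3.4e-6
p_lemon = 1 - (1 - p_defect_6sigma) ** 30_000
print(f"{p_lemon:.1%}")            # ~9.7% of cars would contain at least one defect
```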
Montgomery suggests that at “six-sigma quality level... the process mean can shift by as much as 1.5 standard deviations off target... to produce about 3.4 ppm defective.” Claiming 3.4 ppm defective in the extreme tail of a nonexistent distribution of an out-of-control process is folly. Shifting the process mean is a recipe for disaster.
He also suggests that to improve things, we should let the mean float around a bit. “If the mean is drifting around, and ends up as much as 1.5 standard deviations off target, a prediction of 3.4 ppm defective may not be very reliable, because the mean might shift by more than the ‘allowed’ 1.5 standard deviations.” However, if the mean is shifting, special causes are persistent. The process is out of control and may produce any number of defects, no matter where the specification limits are set.
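For reference, the 3.4 ppm number is nothing more than the area in the tail of an assumed normal distribution beyond 4.5 standard deviations (6 sigma minus the “allowed” 1.5-sigma shift). A sketch of that arithmetic, which only holds if the shifted process really were normal and stable:

```python
from scipy.stats import norm

# The 3.4 ppm figure assumes a normal distribution whose mean has shifted
# 1.5 sigma toward a specification limit set 6 sigma from the original target,
# leaving a 4.5-sigma tail beyond that limit.
tail = norm.sf(4.5)                # P(Z > 4.5) for a standard normal
print(f"{tail * 1e6:.1f} ppm")     # ~3.4 ppm
```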
This highlights the huge need for reeducation in the basics of process behavior charts. Hopefully in the future, people will study Deming and Shewhart rather than Montgomery et al.
A major extension of Shewhart’s work was made in 1942 by William Jennett, with his invention of the XmR chart, a single-point, moving-range control chart. However, it took another 50 years for the chart to be popularized.
The first great benefit of XmR charts is that they are easy. James Womack pointed out that “assembly workers could do most of the functions of the specialists, and do them much better, because of their direct acquaintance with the conditions on the line.” XmR charts are an action tool that any worker can use manually, without special software.
The second great benefit of XmR charts is that they can be used for count data as well as variable data. Shewhart had no choice other than to use specialty charts, such as P, NP, C, and U charts, for count data. However, each of these charts depends on four assumptions about the underlying data distribution (binomial for P and NP charts, Poisson for C and U charts). If those assumptions aren’t met, the chart gives incorrect results. XmR charts do not depend on a data model. Wheeler points out that when a specialty chart gives different results from an XmR chart, it’s an indication that the assumptions for the specialty chart aren’t met, and hence it shouldn’t be used.
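As an illustration of the comparison Wheeler describes, here is a minimal sketch (the monthly counts and the subgroup size of 200 are made-up) that computes P-chart limits and XmR limits for the same fraction-defective data; when the two sets of limits disagree markedly, the binomial assumptions behind the P chart are suspect:

```python
# Made-up monthly defective counts out of n = 200 items inspected each month.
counts = [12, 15, 9, 22, 18, 30, 11, 25, 16, 28, 10, 21]
n = 200
p = [c / n for c in counts]          # fraction defective each month

# P-chart limits (valid only if the binomial model holds)
p_bar = sum(counts) / (n * len(counts))
sigma_p = (p_bar * (1 - p_bar) / n) ** 0.5
print("P chart:  ", p_bar - 3 * sigma_p, p_bar + 3 * sigma_p)   # ~ (0.030, 0.151)

# XmR limits for the same fractions (no distributional assumption)
mean = sum(p) / len(p)
avg_mr = sum(abs(b - a) for a, b in zip(p, p[1:])) / (len(p) - 1)
print("XmR chart:", mean - 2.66 * avg_mr, mean + 2.66 * avg_mr)  # ~ (-0.06, 0.24), far wider
```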
The message for the future is clear: Keep it simple. Use XmR.
A less well-known contribution made by Shewhart was the philosophy of quality. He read philosopher Clarence Lewis’ 1929 book Mind and the World Order (Kessinger Publishing, 2004 reprint) 14 times. When Professor Deming said he had read it seven times and still didn’t understand it, Shewhart told him to read it again. Ultimately, it led to Deming’s System of Profound Knowledge.
Shewhart is known for the Shewhart cycle. He took the scientific method and applied it to processes, using his control charts. Deming turned this into his PDSA. Shewhart also created what he called the “operational meaning,” later called the “operational definition” by Deming. This core tool should be learned and used by everyone but is rarely even mentioned:
1. What do you want to accomplish?
2. By what method will you accomplish your objective?
3. How will you know when you have accomplished your objective?
When should you use Shewhart charts? Covid-19 data provide a classic example. There are many attempts to use control charts and even logarithmic control charts for Covid-19 data. The first step of the operational definition is to ask, “What do you want to accomplish?” Folks trying to draw control charts for Covid-19 data have not asked this question.
What we want to know about Covid-19 is whether it’s getting better or worse. What we want to accomplish is a downward trend. The best method to observe such trends is the run chart, or bar chart.
Control charts can’t help answer this question. All that a control chart can show is that the data are nonhomogeneous. However, we already know this because of the way the disease spreads from one person to many more people.
Process behavior charts are ideal for bringing a process to its full potential and ensuring good quality into the future. They are worthless for Covid-19 data. “Any attempt to use a process behavior chart to analyze the daily Covid-19 values is a misapplication of the technique,” says Wheeler. “It is conceptually equivalent to someone computing the average for a list of telephone numbers.”
How will you know when you have accomplished your objective? In the case of Covid-19, we’ll know this when the run chart trends to zero.
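A run chart needs nothing more than the daily counts plotted in time order. A minimal sketch with made-up case counts (matplotlib assumed to be available):

```python
import matplotlib.pyplot as plt

# Made-up daily case counts, purely for illustration
daily_cases = [120, 135, 150, 160, 148, 170, 165, 158, 140, 130, 118, 105, 96, 88, 75]
days = range(1, len(daily_cases) + 1)

plt.plot(days, daily_cases, marker="o")   # run chart: values plotted in time order
plt.axhline(sorted(daily_cases)[len(daily_cases) // 2], linestyle="--", label="median")
plt.xlabel("Day")
plt.ylabel("Reported cases")
plt.title("Run chart of daily cases (illustrative data)")
plt.legend()
plt.show()
```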
The root cause for the ubiquitous misuse of process behavior charts is training courses that fail to teach the fundamentals of quality. XmR process behavior charts make quality easy for every employee. The operational definition gives guidance about what to do.
“The best analysis is always the simplest analysis that will give the necessary insight,” Wheeler notes. In this case, the best tool is the forgotten run chart. Today, there are about 140 quality tools. Nothing has changed from when Ishikawa pointed out you only need the seven basic quality tools, if used wisely. More tools only add confusion. They don’t improve quality.
Endless fads contribute nothing to quality. There’s no justification to stray from Deming’s PDSA and his operational definition. The operational definition is simple. It cuts through the nonsense.
On one hand we have where quality should go, and on the other where it will probably go. At the current rate we should expect to see another dozen or so new quality fads in the next 40 years. Just a few days ago I saw a consultant claiming there’s a clear need for “Agile TQM Lean Six Sigma.” He obviously feels he has some clients gullible enough to buy in.
Where should quality go? Quality should become a profession in the same way chemical engineering is a profession. We need to eradicate fads and farce. Quality should get back to the fundamentals, laid down by the giants: Deming, Lewis, Ishikawa, Shewhart, and Wheeler (who has made far greater contributions to quality than any other living person and in the near future should be awarded appropriately).
The theme for future quality should be: “Keep it simple. Back to basics.”
Comments
The method for helping people better understand Deming is here:
https://ur.booksc.eu/book/29439978/539fc5 by Dr. Paul Stepanovich
Quality Tools
The approach is somewhat right, as every evolution in quality tools has proved its validity in application. It should be the user's decision how intelligently he or she uses it.
Excellent Article
Dear Dr. Burns,
An excellent article. It should be required reading for every quality practitioner and educator. Far too many "hacks" are out there selling unnecessary complexity, and causing chaos and continued problems as a result.
Dr. Wheeler and a few others have taught me a great deal over the years once I was introduced to their work. I was amazed by how much of what I once knew just wasn't so!
Kind regards,
Steve
Great article. Thanks!
Thank you, dear Anthony!
I want to believe that your persistence in explaining the value and sufficiency of the quality tools proposed by Shewhart, Deming, Ishikawa, and Wheeler will really enable people in companies to focus on quality.
KISS - Yes!
The problem is six-sigma is so engrained and forced onto Tier 1 by Ford and others. It's kinda like COVID-19; it won't go away.
It's a pity if it turns out
It's a pity if it turns out the way you think.