Gregg Profozich


What Are Lean and Six Sigma? Part 3

Six Sigma principles and tools

Published: Tuesday, March 2, 2021 - 12:03

Welcome to the third installment of our series on lean and Six Sigma. As we saw in the first article, lean and Six Sigma are complementary continuous improvement methodologies that reduce the overall waste and variability, respectively, in production processes. The second article went into some depth on a few of the key principles, tools, and methodologies in lean. Here we conclude our series with a high-level discussion of Six Sigma.

There are many tools in the Six Sigma toolkit—failure mode and effects analysis (FMEA), input-process-output (IPO) diagrams, confidence intervals, histograms, Pareto charts, F-tests, design for Six Sigma (DFSS), and others—that will not be discussed here. The focus here is to discuss the statistical realities that make Six Sigma effective.

Six Sigma aims to identify and eliminate the root causes of defects and waste, using statistical tools to pinpoint the sources of variation that cause defects. In Six Sigma methodology, the only way to effectively solve a problem is to permanently eliminate its root cause.

Six Sigma is a measurement-based strategy for process improvement that strives to achieve no more than 3.4 defects per million opportunities. A Six Sigma defect is defined as anything outside of customer specifications. A Six Sigma opportunity is then any chance for a defect to occur.
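The 3.4 figure follows from the convention that, after allowing for a long-term 1.5-sigma shift of the process mean, the nearest specification limit of a six-sigma process sits 4.5 standard deviations away. A minimal sketch of that arithmetic in Python, using only the standard library:

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# A "six sigma" process conventionally allows a long-term 1.5-sigma shift
# of the mean, so the nearest specification limit is 4.5 sigma away.
defect_rate = 1.0 - norm_cdf(4.5)      # one-sided tail probability
dpmo = defect_rate * 1_000_000         # defects per million opportunities
print(f"{dpmo:.1f} DPMO")              # ~3.4
```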

Six Sigma doctrine

Six Sigma is based on three key ideas:
1. Continuous efforts to achieve stable and predictable process results (i.e., reduce process variation) are of vital importance to business success.
2. Manufacturing and business processes have characteristics that can be measured, analyzed, controlled, and improved.
3. Achieving sustained quality improvement requires commitment from the entire organization, particularly from top-level management.

Six Sigma methodology: The DMAIC process

Define, measure, analyze, improve, control (DMAIC) is a data-driven Six Sigma problem-solving methodology. The five phases are described by each letter of the acronym:

Define: Describe the problem, project goals, areas of improvement, or customer requirements.
Measure: Measure current process performance.
Analyze: Analyze the data to identify possible root causes of defects and variation.
Improve: Improve the process by eliminating the root causes.
Control: Control the improved process to sustain the gains.

Let’s break the phases down further.

Once the problem, goal, or area of improvement is described, it is critical to identify the various input and output variables related to the behavior of the process. Often the output variables are the ones that are out of specification. It is important to identify which input variables may be causing the variation in the output variables.

Once the input variables that are causing the output behavior are identified, it is possible to develop a measurement plan that provides enough data to start the analysis. This phase is where data are collected on the key input and output variables. This is also the phase where performance baselines are developed for use in measuring the improvements made later. As a rule, at least 30 observations are required to provide enough data to represent the process behavior.
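As a sketch of what a measure-phase baseline might look like, here is a hypothetical Python example; the 30 simulated oven-temperature readings and their parameters are invented for illustration:

```python
import random
import statistics

# Hypothetical measure-phase data: 30 oven-temperature readings (degrees F).
random.seed(42)
readings = [random.gauss(350.0, 2.5) for _ in range(30)]

baseline_mean = statistics.mean(readings)
baseline_sigma = statistics.stdev(readings)   # sample standard deviation

print(f"baseline mean  = {baseline_mean:.1f}")
print(f"baseline sigma = {baseline_sigma:.2f}")
```

Improvements made later in the project would be judged against this baseline.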

Once the data are collected, they are analyzed to determine the most likely three to five potential root causes. This is accomplished by continued data collection and review, using statistical tools, plots, and charts, to understand the contribution of each potential root cause. The DMAIC process is iterative and repeats until all valid root causes are identified.

Based on the valid root causes identified in the analyze phase, the process is adjusted until the excessive variation is eliminated. The measure and analyze phases are repeated until the desired outcome is achieved.

When the desired outcome is achieved, the improvements are institutionalized so that the source of the excessive variation is eliminated. This step should be accompanied by a control plan to ensure that the outputs continue to be at an acceptable quality level. The control plan includes implementing statistical process control (SPC) to monitor the process and ensure that it continues to function properly over time. This control plan should also include countermeasures if a problem occurs.

In summary: DMAIC is a problem-solving methodology that helps the practitioner approach a problem with excess variation and systematically solve it.

Statistical process control (SPC)

SPC is a tool for monitoring whether a process meets product or process standards. If a process is capable and stable over time, it will produce the outcomes it was designed to produce.

Let’s use an example to better understand these important concepts. Think about baking muffins.

Process capability and variation
A process transforms inputs into outputs. In this case, the ingredients are the inputs, and the muffins are the outputs. The oven must reach a given temperature for a specific amount of time with the muffin batter inside to produce ideal muffins.

Let’s assume that the oven is working correctly. It is capable of producing what we want: warm, perfectly baked muffins. Does the fact that the oven is working correctly guarantee that the baked goods will come out right?

Of course not.

What if the oven is working correctly, but one’s muffin-starved, anxious children keep opening the door to see if the muffins are done yet? The muffins will turn out only half-baked when the timer goes off. This is an example of a special cause of variation: We are not getting the desired outcome because the process is out of control. The process is capable (it will give us what we want when it is employed properly), but it is not in control.

If we ensure the oven is working optimally—at the right temperature and with the oven door closed for the prescribed time—the process will be back under control. We now have a process that is both capable and in control, and we can reasonably expect it to produce an ideal dozen muffins.

If all special causes of variation such as the one described above have been eliminated, then the process is deemed in control, and all the variations that are experienced are variations that are inherent to the process itself. These include small variations in measured ingredients and slight variations in oven temperature. But the process is robust enough to produce the desired outcome even with these sources of (inherent) variation.

In summary: Process capability is a measure of how capable the process is to produce the desired outcome—i.e., it can tell us what percentage of defects the process will inherently produce if it is in control. Desired outcome: a dozen perfectly baked muffins.

Standard deviation
Now that you know how to bake the ideal muffin, we’ll introduce standard deviation.

Standard deviation (σ) is a measure of variation and the number used to calculate process capability. It is calculated as the square root of the variance.

[Figure: the normal distribution (bell curve), showing standard deviation bands]
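As a quick illustration of the definition, here is a Python sketch using hypothetical diameter measurements; the variance is the mean squared deviation from the mean, and the standard deviation is its square root:

```python
import math
import statistics

# Hypothetical diameter measurements (inches).
data = [3.001, 2.999, 3.002, 2.998, 3.000, 3.003, 2.997, 3.000]

variance = statistics.pvariance(data)   # mean squared deviation from the mean
sigma = math.sqrt(variance)             # standard deviation = sqrt(variance)

# The library's pstdev computes the same quantity directly.
assert math.isclose(sigma, statistics.pstdev(data))
```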

Walter Shewhart, the father of statistical quality control, began developing control charts during the early 1920s. He realized that if key process output variables were measured, and their distribution graphed like the bell-shaped curve above, then the variation being displayed was random and, therefore, inherent to the process.

In other words, the process is behaving or operating in the manner it was designed to work. If the data are not random, then there must be a logic to explain that behavior. That is what a special cause of variation is.

Then there’s the Empirical Rule. This rule tells us that, for a normal distribution:
• 68 percent of the observations will fall within plus-or-minus one standard deviation of the mean
• 95 percent of the observations will fall within plus-or-minus two standard deviations of the mean
• 99.7 percent of the observations will fall within plus-or-minus three standard deviations of the mean
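These percentages can be checked numerically; a short Python sketch using the standard library’s error function:

```python
import math

def frac_within(k: float) -> float:
    """Fraction of a normal distribution within +/- k standard deviations."""
    return math.erf(k / math.sqrt(2.0))

# The Empirical Rule values: roughly 68%, 95%, and 99.7%.
for k in (1, 2, 3):
    print(f"within {k} sigma: {frac_within(k):.1%}")
```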

Shewhart also designed control charts (see below) that include control limits, usually set at plus-or-minus three standard deviations from the mean. If a process is in control, we know that 99.7 percent of the data points will fall within the control limits.

[Figure: a control chart, showing the mean and the upper and lower control limits]

SPC uses this knowledge to advantage. Control charts are monitored by entering data and observing where the data points fall relative to the mean (average) and the control limits. As long as the points fall within the control limits, the process is deemed to be in control. So, for the muffin example, a control chart of oven temperature would show the special cause of variation due to the frequent opening of the oven door as points outside the normal range (the normal range of variation being the random temperature variation as the oven’s heat source cycles on and off, for instance).
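A minimal sketch of this monitoring logic in Python, using invented oven-temperature readings and control limits for the muffin example; the last two readings reflect the door being opened repeatedly:

```python
# Hypothetical oven-temperature readings (degrees F); the last two reflect
# the door being opened repeatedly (a special cause of variation).
temps = [350.2, 349.8, 350.5, 349.6, 350.1, 350.3, 349.9, 344.0, 342.5]

# Control limits computed from the stable behavior of the process.
mean, sigma = 350.0, 0.4
ucl = mean + 3 * sigma   # upper control limit
lcl = mean - 3 * sigma   # lower control limit

# Points outside the control limits signal the process is out of control.
out_of_control = [t for t in temps if not (lcl <= t <= ucl)]
print(out_of_control)    # the door-opening readings fall below the LCL
```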

Please note that there is no attention paid to the specifications or specification limits (i.e., tolerances). A process that is in control is producing what it was designed (not necessarily intended) to produce. Thus, it may actually be producing bad outputs.

Process capability index and controls
In Six Sigma, the process capability index (Cpk) is a statistical tool used to measure the ability of a process to produce products within a client’s tolerance range. The higher the Cpk, the narrower the process distribution compared to the tolerance range, and the more uniform the output.

Cpk is calculated using the following formula, where USL refers to the upper specification limit, and LSL refers to the lower specification limit:
Cpk = min(USL - μ, μ - LSL) / (3σ)

The higher the Cpk, the better (a value close to 2.0 is excellent); a Cpk of 1.33 is generally considered the minimum for a process that is in control and meets specification.
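A minimal sketch of the Cpk calculation in Python; the screw-diameter numbers are hypothetical:

```python
def cpk(mean: float, sigma: float, lsl: float, usl: float) -> float:
    """Process capability index: distance to the nearer spec limit
    measured in units of three standard deviations."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Hypothetical screw-diameter process: specs 3.000 +/- 0.003 in.
# With sigma = 0.001 in., the spec limits sit exactly 3 sigma away,
# which gives a Cpk of 1.0.
print(round(cpk(mean=3.000, sigma=0.001, lsl=2.997, usl=3.003), 2))
```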

By now we know if a process is capable and in control, it will, by definition, produce the outcome that the process was designed to produce. We discussed how to measure process control with SPC and the importance of keeping a process in control.

The specification limits relate to the process’s tolerances. For instance, a screw could be dimensioned to be 3 in. in diameter. But how do we accommodate the inherent variation in the process used to produce the screw? We do that by providing tolerances. Suppose it is determined that a screw of 3 in., plus-or-minus three-thousandths of an inch (0.003 in.), in diameter is good enough. All screws within that range of diameters will work successfully in their application.

According to the National Institute of Standards and Technology (NIST): “Process capability compares the output of an in-control process to the specification limits by using capability indices. The comparison is made by forming the ratio of the spread between the process specifications (the specification ‘width’) to the spread of the process values, as measured by six process standard deviation units (the process ‘width’).”

The process capability index is used to determine how close the output is to the existing target and how consistent average performance is. Therefore, it can be used to predict future output performances and consistency.

[Figure: process capability index and standard deviation]

For our purposes, all we need to know is that we can determine the ability of a process to produce good parts. This is the same as answering the following:
• What percentage of the output will meet the customer’s specifications?
• How many screws will be larger than 3 in. plus three-thousandths in diameter (3.003 in.)?
• How many screws will be smaller than 3 in. minus three-thousandths in diameter (2.997 in.)?
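These questions can be answered with the normal distribution. A sketch in Python, assuming a hypothetical process centered at 3.000 in. with a standard deviation of 0.001 in., so the tolerances sit exactly three sigma from the mean:

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Hypothetical screw process: centered at 3.000 in., sigma = 0.001 in.
mean, sigma = 3.000, 0.001
usl, lsl = 3.003, 2.997

too_big = 1.0 - norm_cdf((usl - mean) / sigma)   # beyond the upper spec
too_small = norm_cdf((lsl - mean) / sigma)       # beyond the lower spec
print(f"{(too_big + too_small):.2%} out of spec")  # ~0.27%
```

Splitting the 0.3 percent tail equally, roughly 0.135 percent of screws would be too large and 0.135 percent too small.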

You may remember that plus-or-minus three standard deviations of variation signify a quality level of 99.7 percent good work produced. This is the equivalent of a process capability measure (Cp) of 1.0.

The literature is pretty much in agreement: We need a process capability (Cp and Cpk) measure of at least 1.33.

This margin allows for what is known as “shift and drift,” the tendency of the process average to move over time, which would otherwise cause defects that a perfectly centered, variation-free process would not produce.

But there will always be variation. The key is to maintain good process control to avoid defects.

Again, according to NIST:
“A process capability index uses both the process variability and the process specifications to determine whether the process is ‘capable’.”

Shift and drift

The main issue in maintaining good process control is that, over time, any process will shift and drift, no matter how tight the initial settings were. When this occurs, the key point to remember is that as the process average moves, so does the entire variable range, while the specification limits stay stationary.

If the process moves beyond the specification limits, it will be making defective products. You want to maintain index levels of 1.33 or better. This is achieved by keeping the process average well centered and by minimizing variability.

In summary: Using Six Sigma, it is possible to determine whether a process is capable and to measure both process control and process capability. A process that is capable of producing the desired outcome (i.e., a process capability Cpk of at least 1.33) will perform properly as long as it remains in control.

First published on the CMTC blog.


About The Author


Gregg Profozich

With degrees in business administration in finance and business economics, Gregg Profozich is the director of advanced manufacturing technologies at California Manufacturing Technology Consulting.

Profozich is a skilled leader with more than two decades of experience across manufacturing, operations, supply chain, strategy execution, and information technology. Drawing on his background across Fortune 500 companies, startups, and consulting, he is experienced in pioneering new tools, approaches, and services to assist SMMs in improving their global competitiveness.