Eston Martz

Six Sigma

Five More Critical Six Sigma Tools: A Quick Guide

Getting familiar with these tools is a good way to get started on your quality journey

Published: Wednesday, September 6, 2017 - 11:02

The Six Sigma quality improvement methodology has lasted for decades because it gets results. Companies in every country around the world, and in every industry, have used this logical, step-by-step method to improve the quality of their processes, products, and services. And they’ve saved billions of dollars along the way.

However, Six Sigma involves a good deal of statistics and data analysis, which makes many people uneasy. Individuals who are new to quality improvement often feel intimidated by the statistical aspects.

Don’t be intimidated. Data analysis may be a critical component of improving quality, but the good news is that most of the analyses we use in Six Sigma aren’t hard to understand, even if you aren’t comfortable with statistics.

Just getting familiar with the tools used in Six Sigma is a good way to get started on your quality journey. In my last column, I offered a rundown of five tools that crop up in most Six Sigma projects. Here, I’ll review five more common statistical tools, and explain what they do and why they’re important in Six Sigma.

1. T-tests

We use t-tests to compare the average of a sample to a target value, or to the average of another sample. For example, a company that sells beverages in 16-oz. containers can use a 1-sample t-test to determine if the production line’s average fill is on or off target. If you buy flavored syrup from two suppliers and want to determine if there’s a difference in the average volume of their respective shipments, you can use a 2-sample t-test to compare the two suppliers. 
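
To make this concrete, here is a minimal sketch of both tests in Python using scipy.stats (the article itself works in Minitab); the fill volumes and supplier shipments below are invented for illustration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

# 1-sample t-test: is the line's average fill on the 16-oz. target?
fills = rng.normal(loc=16.05, scale=0.12, size=30)  # hypothetical fill volumes (oz.)
t_stat, p_value = stats.ttest_1samp(fills, popmean=16.0)
print(f"1-sample t-test vs. 16 oz.: t = {t_stat:.2f}, p = {p_value:.3f}")

# 2-sample t-test: do the two syrup suppliers ship different average volumes?
supplier_a = rng.normal(loc=54.8, scale=1.1, size=25)  # hypothetical volumes (oz.)
supplier_b = rng.normal(loc=55.4, scale=1.3, size=25)
t_stat, p_value = stats.ttest_ind(supplier_a, supplier_b, equal_var=False)  # Welch's test
print(f"2-sample t-test, supplier A vs. B: t = {t_stat:.2f}, p = {p_value:.3f}")

A small p-value (commonly below 0.05) suggests the average really is off target, or that the suppliers really do differ.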

2. ANOVA


Where t-tests compare a mean to a target, or two means to each other, ANOVA—which is short for analysis of variance—lets you compare more than two means. For example, ANOVA can show you if average production volumes across three shifts are equal. You can also use ANOVA to analyze means for more than one variable. For example, you can simultaneously compare the means for three shifts and the means for two manufacturing locations. 
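
As an illustrative sketch rather than Minitab output, a one-way ANOVA comparing three shifts can be run in Python with scipy.stats.f_oneway; the production volumes below are invented.

import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2)

# Hypothetical daily production volumes for three shifts
shift_1 = rng.normal(loc=1000, scale=40, size=20)
shift_2 = rng.normal(loc=990, scale=40, size=20)
shift_3 = rng.normal(loc=1025, scale=40, size=20)

# One-way ANOVA: are the three shift means plausibly equal?
f_stat, p_value = stats.f_oneway(shift_1, shift_2, shift_3)
print(f"One-way ANOVA across shifts: F = {f_stat:.2f}, p = {p_value:.3f}")

For the shift-plus-location example, a two-way ANOVA (for instance, statsmodels' ols() with anova_lm()) would add the second factor to the same kind of analysis.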

3. Regression


Regression helps you determine whether there’s a relationship between an output and one or more input factors. For instance, you can use regression to examine if there is a relationship between a company’s marketing expenditures and its sales revenue. When a relationship between the variables exists, you can use the regression equation to describe that relationship and predict future output values for given input values.
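
Here is a minimal sketch of a simple linear regression in Python with scipy.stats.linregress; the marketing-spend and revenue figures are made up for the example.

import numpy as np
from scipy import stats

# Hypothetical data: marketing spend and sales revenue (both in $1,000s)
spend = np.array([10, 15, 20, 25, 30, 35, 40, 45])
revenue = np.array([120, 150, 175, 210, 230, 255, 290, 310])

# Fit revenue = intercept + slope * spend
result = stats.linregress(spend, revenue)
print(f"revenue = {result.intercept:.1f} + {result.slope:.2f} * spend, "
      f"R-squared = {result.rvalue**2:.3f}")

# Use the fitted equation to predict revenue for a new spend level
new_spend = 50
predicted = result.intercept + result.slope * new_spend
print(f"Predicted revenue at spend = {new_spend}: {predicted:.1f}")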

4. DOE (design of experiments)


Regression and ANOVA are most often used for data that have already been collected. In contrast, design of experiments (DOE) gives you an efficient strategy for collecting your data. It permits you to change or adjust multiple factors simultaneously to identify whether relationships exist between inputs and outputs. Once you collect the data and identify the important inputs, you can then use DOE to determine the optimal settings for each factor. 
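
As a rough sketch, assuming a two-level full factorial design, the runs can be generated with itertools and the main effects estimated with ordinary least squares; the factor names and responses below are hypothetical, and a dedicated DOE tool such as Minitab would add randomization, replication, and interaction analysis.

from itertools import product

import numpy as np

# 2^3 full factorial design: three factors, each at a low (-1) and high (+1) level
factors = ["temperature", "pressure", "time"]
design = np.array(list(product([-1, 1], repeat=len(factors))))  # 8 runs

# Hypothetical measured responses (e.g., percent yield) for the 8 runs, in design order
response = np.array([62, 68, 65, 74, 61, 70, 66, 79])

# Estimate main effects by least squares: response = b0 + b1*temp + b2*press + b3*time
X = np.column_stack([np.ones(len(design)), design])
coeffs, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, b in zip(["intercept"] + factors, coeffs):
    print(f"{name}: {b:+.2f}")

The factor with the largest coefficient in absolute value has the strongest main effect, and its sign points to which setting (low or high) pushes the response in the direction you want.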

5. Control charts


Every process has some natural, inherent variation, but a stable (and therefore predictable) process is a hallmark of quality products and services. It’s important to know when a process goes beyond the normal, natural variation because it can indicate a problem that needs to be resolved. A control chart distinguishes “special cause” variation from acceptable, natural variation. These charts graph data over time and flag out-of-control data points, so you can detect unusual variability and take action when necessary. Control charts also help you ensure that you sustain process improvements into the future. 
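
Here is a minimal sketch of an individuals (I) chart calculation in Python with numpy, using the conventional 3-sigma limits estimated from the average moving range (d2 = 1.128 for a moving range of two); the measurements are invented, and a charting package or Minitab would also plot the points over time.

import numpy as np

# Hypothetical individual measurements collected over time
x = np.array([16.02, 15.98, 16.05, 16.01, 15.97, 16.04, 16.00, 16.35,
              16.03, 15.99, 16.02, 16.06])

center = x.mean()

# Estimate process sigma from the average moving range
moving_range = np.abs(np.diff(x))
sigma_hat = moving_range.mean() / 1.128

ucl = center + 3 * sigma_hat  # upper control limit
lcl = center - 3 * sigma_hat  # lower control limit
print(f"Center = {center:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}")

# Flag points showing "special cause" variation (outside the control limits)
out_of_control = np.where((x > ucl) | (x < lcl))[0]
print("Out-of-control points at indices:", out_of_control)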

Conclusion

Any organization can benefit from Six Sigma projects, and those benefits rest on data analysis. However, many Six Sigma projects are carried out by practitioners who are highly skilled but not expert statisticians. A basic understanding of common Six Sigma statistics, combined with easy-to-use statistical software, will let you handle these statistical tasks and analyze your data with confidence.

About The Author

Eston Martz

For Eston Martz, analyzing data is an extremely powerful tool that helps us understand the world—which is why statistics is central to quality improvement methods such as lean and Six Sigma. While working as a writer, Martz began to appreciate the beauty in a robust, thorough analysis and wanted to learn more. To the astonishment of his friends, he started a master’s degree in applied statistics. Since joining Minitab, Martz has learned that a lot of people feel the same way about statistics as he used to. That’s why he writes for Minitab’s blog: “I’ve overcome the fear of statistics and acquired a real passion for it,” says Martz. “And if I can learn to understand and apply statistics, so can you.”