Davis Balestracci

Six Sigma

Wasting Time With Vague Solutions, Part 3

You’ve exhausted in-house data. Now what?

Published: Wednesday, September 19, 2012 - 11:35

Editor’s note: This is the third of a three-part series on effective, focused data analysis. Part one discussed helping management deal with common cause; the first common cause strategy—stratification—was discussed in part two.

In my last column, I introduced some aspects of common cause strategies using Juran’s wisdom of “exhaust in-house data.” That involved defining recurring problems, assessing the effect of each problem, and localizing each major problem. I suggested these as a preliminary process for change agents to organize data before getting other people involved; in doing this, you will no doubt come to some conclusions of your own. The next step is to discuss those conclusions with the key players: Do the results of your in-house data analyses seem logical to the people involved? Are there obvious changes that would eliminate the problem? Are there obvious ways to prevent similar problems in the future?

Sometimes the people closest to the processes will have obvious “no-brainer” solutions that, no doubt, isolated individuals have suggested in the past. The problem was that these ideas needed management support to be implemented. You have now brought a critical situation into focus, setting their ideas up for success (and you will ensure that they get all the credit, right?).

OK, that gets part of the problem solved. Before moving on, it’s probably a good idea to get people’s input to assess the value of continuing to collect data in this form; you may as well improve the current data collection process while you’re at it. More important, involve these people in the next phase.

You need more data

It’s time to ask: What other questions still need to be answered? What other data would be useful? How can you get them? What additional key players might be helpful in designing such a data collection? What might these additional data tell you about the occurrence of this problem?

We are now entering the realm of a designed data collection process. Once again, Juran’s wisdom will save the day. Data collection is indeed very important, but in the early stages of understanding an improvement opportunity there is a strong tendency to collect too much of it.

Exhausting your in-house data has hopefully shed some light and focus on what data are truly needed to continue. The next phase of data collection must somehow further isolate the “20 percent of the process causing 80 percent of the problem,” and do it easily. Once that occurs, more serious “drill downs” can be considered. With that in mind, beware!

If the data collection is not carefully designed and defined, it is easy to accumulate too much data with vague objectives. This leaves people scratching their heads about what to do with it. And as I’ve found many times, any resemblance between what you intended to collect and what actually gets collected can seem purely coincidental.

When data collection objectives and methods are unclear, there is a serious risk of unduly inconveniencing the people doing the collecting. Those providing data understandably become peeved when told, “We’re sorry. We forgot to think about ___________. Would you mind doing it again?” Credibility for the project is lost, along with the credibility and cooperation needed for future collections and projects. So, to “get the respect you deserve,” it’s time to apply Juran’s second strategy.

Juran’s strategy No. 2: Study the current process

A preliminary in-house data analysis often involves stratification of an important output, tracing it back to its input sources. Through minimal additional documentation of normal work, further stratification and subsequent Pareto analysis can expose potentially deeper, significant sources of variation.
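As a rough illustration of stratifying and then Pareto-ranking a problem, here is a minimal Python sketch. The defect log, source names, and counts below are invented for the example; the point is only the mechanics: tally occurrences by stratum, then rank strata by contribution with a cumulative percentage.

```python
from collections import Counter

# Hypothetical defect log traced during normal work:
# each record is a (defect_type, source) pair. Data are invented.
defect_log = [
    ("scratch", "Line A"), ("scratch", "Line A"), ("dent", "Line B"),
    ("scratch", "Line A"), ("misalign", "Line C"), ("scratch", "Line B"),
    ("dent", "Line B"), ("scratch", "Line A"), ("misalign", "Line A"),
    ("scratch", "Line A"),
]

# Stratify: count defects by input source
by_source = Counter(source for _, source in defect_log)

# Pareto ordering: sources ranked by contribution, with cumulative percent
total = sum(by_source.values())
cumulative = 0
for source, count in by_source.most_common():
    cumulative += count
    print(f"{source}: {count} ({100 * cumulative / total:.0f}% cumulative)")
```

With these made-up numbers, one source accounts for 60 percent of the defects, which is exactly the kind of “20 percent causing 80 percent” signal this strategy is meant to surface.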

“Studying the current process” allows work processes to proceed normally while recording data that, although not routinely needed, are virtually there for the taking. It’s almost like taking a time-lapse video of the work as it occurs.

If no in-house data are available, this will obviously become your first strategy. Sometimes it can be just as simple as observing the process over time and “plotting the dots” to get a process baseline. The dialogue in planning this process will also be helpful for subsequent data planning.
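“Plotting the dots” can be as simple as one measurement per time period against a baseline. The sketch below uses invented weekly counts and the plain mean as the baseline, printed as a crude text run chart; in practice you would plot these points on a run or control chart.

```python
# Hypothetical weekly counts of a recurring problem; values are invented.
counts = [12, 9, 14, 11, 8, 13, 10, 15, 9, 12]

baseline = sum(counts) / len(counts)  # process baseline (mean)

# Crude text run chart: one row per week, dots scaled to the count,
# with +/- marking whether the week sits above or below the baseline.
for week, c in enumerate(counts, start=1):
    flag = "+" if c > baseline else "-"
    print(f"week {week:2d} {flag} {'.' * c}")
print(f"baseline = {baseline:.1f}")
```

Even this minimal picture supports the dialogue the column describes: people can see the process over time before anyone proposes a richer data collection.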

Juran’s first two strategies allow stratification to expose many sources of special causes. This will result in very focused, localized efforts to fix the obvious-but-previously-hidden problems. This is also useful in dealing with ongoing mistakes in executing procedures, and it will address current practices that fail to recognize the need for preventive measures.

Solutions to these problems frequently involve redesigning the process to include an error-proofing element.

Identifying deep causes of variation can also expose unnecessary steps, rework, and wasteful data collection. Examples include extra paperwork (especially inspections and pseudo-inspections) and time buffers that merely condone an inefficient process.

This wisdom of Juran that I’ve used in this three-part series is contained in his seminal 1964 book, Managerial Breakthrough (McGraw-Hill, revised ed. 1995). Do you sense an uncanny resemblance to Six Sigma and lean? Two more strategies remain.

But first: You’ve now got a nice data collection planned with good, focused objectives. Have you considered data collection itself as a process—with a lot of potential (human) variation that could compromise the quality of the data you collect?

We will detour from Juran to discuss more about that next time.


About The Author

Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.