Davis Balestracci

Quality Insider

Four Data Processes, Eight Questions, Part 2

Minimizing human variation in quality data

Published: Friday, October 12, 2012 - 15:51

Human perception of variation and how we execute the methods of four data processes—measurement, collection, analysis, and interpretation—were discussed in part one of this column. Because human variation can compromise the quality of data and render any subsequent analysis virtually useless for project purposes, its effects must be anticipated and minimized. To deal with this, eight questions need to be addressed.

The first four questions: reducing human variation in design

• Why collect the data? Is there a clear objective for this collection?
• What method(s) will be used for the analysis? This should be known even before one piece of data is collected.
• What data will be collected? What specific process output(s) does one wish to capture?
• How will the data be measured? How will one evaluate any output to obtain a consistent number, regardless of who measures it? Remember, there is no “true value”; the appropriate measurement depends on the specific objective.
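
To make the design phase concrete, here is a minimal sketch (not from the column itself) of how the answers to these four questions might be written down as a structured plan before any data are collected; every field name and example value is hypothetical.

from dataclasses import dataclass
from typing import List

@dataclass
class DataCollectionDesign:
    """Hypothetical record of the four design answers, agreed on
    before a single piece of data is collected."""
    objective: str               # Why collect the data?
    analysis_methods: List[str]  # What method(s) will be used for the analysis?
    outputs: List[str]           # What specific process output(s) will be captured?
    operational_definition: str  # How the data are measured, regardless of who measures

design = DataCollectionDesign(
    objective="Reduce turnaround time for routine lab results",
    analysis_methods=["run chart", "control chart"],
    outputs=["turnaround time per specimen, in minutes"],
    operational_definition=(
        "Minutes from the specimen check-in timestamp to the result-posting "
        "timestamp, both read from the lab system and rounded to the nearest minute"
    ),
)

Writing the operational definition down this explicitly is what makes the number consistent “regardless of who measures it.”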

The second four questions: reducing human variation in collection logistics

To reiterate the point made in the second paragraph of part one of this article: As in any process, the collection process itself has the six sources of process inputs—people, methods, machines, materials, measurements, and environments—and is vulnerable to outside variation that can compromise its quality.

Once you’ve answered the four questions above, you’ve got a great data design. However, one cannot ignore the need to formally plan for data consistency and stability. Especially in this area, the unintended ingenuity of human psychology to sabotage even the best of designs is a most formidable force to overcome.

Despite the best design, many problems lurk in its ensuing collection process. Only when the human variation in the collection process is also minimized can the data be trusted enough to take appropriate action on the process being improved.

Consider the following four questions:
• How often will the data be collected?
• Where will the data be collected?
• Who will collect the data?
• What training is needed for the data collectors?

To address these, one must consider the even deeper logistics of obtaining the data. For this, understanding the data collectors and their environment is crucial. Put yourself in a data collector’s position. Where is the best leverage point in the process to collect the data? Where will the job flow suffer minimum interruption?

As the improvement facilitator, consider: What effect will their normal job have on the proposed data collection? Is there a risk of data being incomplete? Are you asking for more data than are needed for the current objective? And another “human” issue: Will the collector’s perception of how often you want the data recorded be a barrier to cooperation?

Given all this, then: Who exactly should be the collector? Is this person unbiased, and does she have easy and immediate access to the relevant facts?
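
Continuing the same hypothetical sketch, the logistics answers can be recorded alongside the design so that frequency, location, collectors, and training are decided explicitly rather than left to chance. Again, all names and values below are illustrative only.

from dataclasses import dataclass
from typing import List

@dataclass
class CollectionLogistics:
    """Hypothetical record of the second four answers: collection logistics."""
    frequency: str         # How often will the data be collected?
    location: str          # Where in the process will they be collected?
    collectors: List[str]  # Who will collect them?
    training: List[str]    # What training do the collectors need?

logistics = CollectionLogistics(
    frequency="every specimen, recorded at the time the result is posted",
    location="result-posting workstation (the point of minimum interruption to job flow)",
    collectors=["the technologist posting the result on each shift"],
    training=["30-minute walk-through of the form", "review of a properly completed example"],
)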

Don’t forget the logistics of recording the data

There remains yet another nontrivial issue: Exactly how will the data be recorded? Seriously consider involving some of the data collectors in the design of any data collection forms. Remember, a “check sheet” (data collection form) is one of the original seven basic improvement tools. Do the forms allow efficient recording of data with minimum interruption to people’s normal jobs?

To allow this, there are further issues for improvement facilitators to keep in mind:
• Reduce opportunities for error.
• Design traceability to collector and environment.
• Make the data collection form virtually self-explanatory and professional looking, and keep it simple.
• Formally train all the people who will be involved. Have a properly completed data collection form available to use as reference.
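
As a rough illustration of “reduce opportunities for error” and “design traceability,” a check-sheet entry can be validated before it is accepted: required fields force traceability to the collector and environment, and a fixed category list keeps free-text mistakes out. Everything below, including the validate_entry helper, is hypothetical.

from datetime import datetime

# Hypothetical check-sheet entry. A fixed category list reduces opportunities
# for error; the collector, shift, and station fields build in traceability.
DEFECT_CATEGORIES = {"scratch", "dent", "misalignment", "contamination", "other"}
REQUIRED_FIELDS = {"collector_id", "shift", "station", "category", "count"}

def validate_entry(entry: dict) -> list:
    """Return a list of problems with one check-sheet entry (empty list = OK)."""
    problems = []
    missing = REQUIRED_FIELDS - entry.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if entry.get("category") not in DEFECT_CATEGORIES:
        problems.append(f"unknown category: {entry.get('category')!r}")
    count = entry.get("count")
    if not isinstance(count, int) or count < 0:
        problems.append("count must be a non-negative integer")
    return problems

entry = {"collector_id": "JT", "shift": "2nd", "station": "paint line 3",
         "category": "scratch", "count": 2,
         "recorded_at": datetime.now().isoformat()}
print(validate_entry(entry))  # an empty list means the entry is complete and usable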

As part of this training, reduce fear by discussing the importance of complete and unbiased information. Then be sure to answer the following natural questions of the collectors:
• What is the purpose of the study? What are the data going to be used for?
• Will the results be communicated to them?

And, finally, test the data collection process to expose and attempt to remove most remaining sources of inappropriate and unintended variation. Pilot the forms and instructions on a small scale.

After the (brief) test, revise the data collection processes if necessary using input from the collectors. Do the processes work as expected? Are the collection forms filled out properly? Do people have differing perceptions of operational definitions? Are the processes as easy to use as originally perceived?

When possible, sit with and observe the people collecting the data. And as new people enter the collection process, have someone who knows what to do watch the first attempts of novice data collectors.

You are now as ready as you’ll ever be to begin your formal data collection. But again, be forewarned: The human factor is always lurking—via W. Edwards Deming’s funnel experiment rule No. 4: unintentional “random walk” from original intentions. Throughout the collection, audit for missing data, unusual values, and possible bias. Occasionally observe the actual collection to continue improving measurement consistency and stability. And don’t be surprised by anything you observe.
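
One way to make such an audit routine, offered purely as an illustrative sketch rather than anything prescribed here, is a small script that flags missing values, implausible values, and per-collector summaries as a quick look for possible bias. The data, column names, and plausible-range limits below are invented.

import pandas as pd

def audit_collection(df: pd.DataFrame, value_col: str, collector_col: str,
                     low: float, high: float) -> None:
    """Flag missing data, values outside a plausible range, and
    per-collector counts/means (a quick check for possible bias)."""
    print("Missing values per column:")
    print(df.isna().sum())

    outside = df[(df[value_col] < low) | (df[value_col] > high)]
    print(f"\nValues outside the plausible range [{low}, {high}]: {len(outside)}")
    print(outside)

    print("\nRecords and mean value per collector:")
    print(df.groupby(collector_col)[value_col].agg(["count", "mean"]))

# Hypothetical in-progress collection: turnaround times recorded by two collectors.
data = pd.DataFrame({
    "collector": ["JT", "JT", "MR", "MR", "MR"],
    "turnaround_min": [42, 38, 55, None, 400],  # None = missing, 400 = suspicious
})
audit_collection(data, "turnaround_min", "collector", low=5, high=180)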

Get the respect you deserve

People involved in the actual data collection should have confidence that you and your team know exactly what you are asking and looking for—and that you are actually going to do something with the information. People perceive their jobs as already taking up at least 100 percent of their time; they’re doing you a favor, so make it easy for them. Make sure the data collection results ultimately make their work lives easier as well. If you do, your future projects certainly will be easier, too.


About The Author


Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.

Comments

The Right Questions

The article lists valuable questions to answer to prepare for data collection. We pilot all improvements so it makes good sense to pilot and adjust data collection forms before full deployment. Thank you for helping us be more effective!

Good and concise comments

Well written, Davis. Your Four Phases and Eight Questions give a clear picture of effective data planning, gathering and use.