
Davis Balestracci

Quality Insider

Four Data Processes, Eight Questions, Part 1

Variations on a theme of process inputs

Published: Thursday, October 11, 2012 - 08:58

Have you ever been responsible for a data collection where any resemblance between what you designed and what you got back was purely coincidental? When that happens yet again, I say to myself, “Well, it was perfectly clear to me what I meant.”

Consider the use of statistics as a data process, or rather, four processes: measurement, collection, analysis, and interpretation. Each of these has six sources of process inputs: people, methods, machines, materials, measurements, and environments. Each also has an inherent “quality” associated with it and is subject to outside sources of variation that can compromise this quality. Unless the variation in these processes is minimized, there is a danger of reacting to variation in the data process itself instead of in the process you are trying to understand and improve.

What is the biggest danger? Human variation, which includes our perception of the variation (“measurement”) and how we execute the “methods” of measurement, collection, analysis, and interpretation.

In that context, let’s consider each of these four data processes:

Measurement

Are the data operationally defined with clear objectives? (W. Edwards Deming was fond of saying, “There is no true value of anything.”)
• What is the concept you are trying to evaluate?
• What data will allow you to attach a value to this concept? By what standards or measures will you judge it?
• Can you write down clear descriptions of how to measure the characteristic?
  —What are some factors that might cause measurements of the same item or situation to vary?
  —In the case of measuring discrete events, is the threshold of occurrence between a “nonevent” (“0”) and an “event” (“1”) clear and understood? (A sketch of such a definition follows this list.)
  —How can you reduce the impact of these factors?
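
For illustration, here is a minimal sketch in Python of what writing such a definition down might look like. The “late start” event and the five-minute threshold are hypothetical, invented purely for this example:

```python
from datetime import timedelta

# Hypothetical operational definition of a discrete event: an
# appointment is a "late start" (event = 1) only if it begins more
# than five minutes after its scheduled time. The threshold is
# written down, not left to each observer's judgment.
LATE_THRESHOLD = timedelta(minutes=5)

def is_late_start(scheduled, actual):
    """Return 1 for an "event" (late start), 0 for a "nonevent"."""
    return 1 if (actual - scheduled) > LATE_THRESHOLD else 0
```

Two data collectors applying this definition to the same appointment must record the same value; “it was only a little late” is no longer a judgment call.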

Collection

Consider your plan for collecting these data:
• Will the data collectors have to take samples?
  —How often?
  —How many?

• How will the data be recorded?
• Can you design a data sheet (check sheet) to record the data as simply as possible? (A minimal sketch follows at the end of this section.)

Another issue to consider in many cases: Do your customers or suppliers (both internal and external) collect the same kind of data? What procedures or instruments do they use? Are your definitions, standards, and procedures comparable to those used by customers and suppliers?
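
To make the check-sheet idea concrete, here is a minimal sketch in Python. The defect categories are invented for illustration; the point is that the sheet records only categories agreed on in advance, enforcing its own operational definitions:

```python
from collections import Counter

# Hypothetical defect categories, agreed on before collection begins.
CATEGORIES = {"smear", "misalignment", "void", "other"}
check_sheet = Counter()

def record(category):
    """Add one tally mark; reject anything outside the agreed list."""
    if category not in CATEGORIES:
        raise ValueError(f"{category!r} is not an agreed category")
    check_sheet[category] += 1

record("smear")
record("void")
record("smear")
print(check_sheet.most_common())  # [('smear', 2), ('void', 1)]
```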

Analysis

Are you aware that your analysis should be decided before a single piece of data has been collected?

When one applies a specific statistical technique, there is an underlying assumption that the data were collected in the specific way that makes the analysis appropriate. The danger here is that the computer will do anything you want—whether the data were collected appropriately or not.

Imagine you have the data in hand:
• What could these data tell you?
• What will you do with the data? What specific statistical technique(s) will you use? (One illustrative sketch follows this list.)
• Were the data collected in a way that makes this analysis appropriate?
• What will you do after that? Would another kind of data be more helpful?
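
The article does not prescribe a particular technique, but as one concrete illustration of deciding the analysis in advance, here is a Python sketch of the calculation behind an XmR (individuals and moving range) chart. The technique assumes one measurement per time period, recorded in time order; the data below are made up:

```python
def xmr_limits(values):
    """Return (lower limit, mean, upper limit) for an individuals chart."""
    # Average moving range: the mean absolute difference between
    # successive values, taken in time order.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    x_bar = sum(values) / len(values)
    # 2.66 is the standard constant that converts the average moving
    # range into 3-sigma limits for individual values.
    return x_bar - 2.66 * mr_bar, x_bar, x_bar + 2.66 * mr_bar

weekly_values = [12.1, 11.8, 12.4, 12.0, 13.1, 11.9, 12.2, 12.5]
print(xmr_limits(weekly_values))  # approx. (10.65, 12.25, 13.85)
```

Note that if the values had not been collected one per period in time order, the function would still happily return limits; nothing in the computer protects you from an inappropriate analysis.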

Interpretation

Did you know that statistics is a set of techniques used not to “massage” data but to proactively interpret the variation on which you must take appropriate action? The danger: All variation is one of two types (common cause or special cause), and treating one as the other actually makes the situation worse.
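
To continue the hypothetical XmR sketch from the Analysis section (the xmr_limits() function is repeated so this block runs on its own): a point outside the computed limits signals special-cause variation worth investigating, while points inside reflect common-cause variation, best addressed by working on the process rather than chasing individual values:

```python
def xmr_limits(values):
    """3-sigma limits for an individuals chart (see the earlier sketch)."""
    mr_bar = sum(abs(b - a) for a, b in zip(values, values[1:])) / (len(values) - 1)
    x_bar = sum(values) / len(values)
    return x_bar - 2.66 * mr_bar, x_bar, x_bar + 2.66 * mr_bar

def classify(values):
    """Label each point as common-cause or special-cause variation."""
    lower, _, upper = xmr_limits(values)
    return ["special" if v < lower or v > upper else "common" for v in values]

# Made-up data: a stable process with one genuine signal at the end.
print(classify([12.1, 11.8, 12.4, 12.0, 12.3, 11.9, 12.2, 12.5, 18.5]))
# ['common', 'common', 'common', 'common', 'common', 'common',
#  'common', 'common', 'special']
```

Reacting to a common-cause point as if it were special is tampering; dismissing a special-cause point as routine means ignoring a real signal. Either mistake increases variation.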

To be continued in part two.


About The Author

Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.

Comments

4 x 8 processes & questions

Thank you, Davis. I feel you're on your own road to Damascus now: Sooner or later you'll discover the truth of Ed Ricketts' non-teleology, and that much of the Statistix' troublesome hard work is mainly "hey, brothers, we're doing it for ourselves."

Ciao.