Four Data Processes, Eight Questions, Part 1

Variations on a theme of process inputs

Davis Balestracci

Published: Thursday, October 11, 2012 - 08:58

Have you ever been responsible for a data collection where any resemblance between what you designed and what you got back was purely coincidental? When that happens, yet again, I say to myself, “Well, it was perfectly clear to me what I meant.”

Consider the use of statistics as a data process, or rather, four processes: measurement, collection, analysis, and interpretation. Each of these has six sources of process inputs: people, methods, machines, materials, measurements, and environments. Each also has an inherent “quality” associated with it, and each is subject to outside variation that can compromise this quality. Unless the variation in these processes is minimized, there is a danger of reacting to variation in the data process instead of variation in the process you are trying to understand and improve.

What is the biggest danger? Human variation, which includes our perception of the variation (“measurement”) and our execution of the “methods” of measurement, collection, analysis, and interpretation. In that context, let’s consider each of these four data processes.

Measurement

Are the data operationally defined, with clear objectives? (W. Edwards Deming was fond of saying, “There is no true value of anything.”)

• What is the concept you are trying to evaluate?
• What data will allow you to attach a value to this concept? By what standards or measures will you judge it?
• Can you write down clear descriptions of how to measure the characteristic?
—What are some factors that might cause measurements of the same item or situation to vary?
—In the case of measuring discrete events, is the threshold of occurrence between a “nonevent” (“0”) and an “event” (“1”) clear and understood?
—How can you reduce the impact of these factors?

Collection

Consider your plan for collecting these data:

• Will the data collectors have to take samples?
—How often?
—How many?
• How will the data be recorded?
• Can you design a data sheet (check sheet) to record the data as simply as possible?

Another issue to consider in many cases: Do your customers or suppliers (both internal and external) collect the same kind of data? What procedures or instruments do they use? Are your definitions, standards, and procedures comparable to those used by customers and suppliers?

Analysis

Are you aware that your analysis should be known before one piece of data has been collected? When one applies a specific statistical technique, there is an underlying assumption that the data were collected in the specific way that makes the analysis appropriate. The danger here is that the computer will do anything you want, whether the data were collected appropriately or not.

• What could these data tell you?
• What will you do with the data? What specific statistical technique(s) will you use?
• Were the data collected in a way that makes this analysis appropriate?
• What will you do after that? Would another kind of data be more helpful?

Interpretation

Imagine you have the data in hand: Did you know that statistics is a set of techniques used not to “massage” data, but to interpret, more proactively, the variation on which you must take appropriate action?

The danger: Any variation is one of two types, and treating one as the other actually makes the situation worse.

To be continued in part two.
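The “two types of variation” in the closing paragraph are Walter Shewhart’s common-cause and special-cause variation, conventionally distinguished with an individuals (XmR) process behavior chart. As a minimal sketch only, in Python, with invented monthly defect counts and helper names of my own choosing (the article itself prescribes no particular code or data):

```python
# Sketch: separating common-cause from special-cause variation with
# individuals (XmR) chart limits. Data and function names are invented
# for illustration; 2.66 is the standard XmR constant for individuals.

def xmr_limits(data):
    """Return (mean, lower, upper) natural process limits.

    Limits are mean +/- 2.66 * average moving range of
    consecutive values.
    """
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    mean = sum(data) / len(data)
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

def classify(data):
    """Label each point: inside the limits is common cause, outside is special."""
    _, lower, upper = xmr_limits(data)
    return ["special" if x < lower or x > upper else "common" for x in data]

# Hypothetical monthly defect counts: the 30 is a special cause worth
# investigating; the rest is common-cause noise that should not be
# reacted to point by point.
counts = [12, 15, 11, 14, 13, 30, 12, 14, 13, 12]
print(classify(counts))  # only the 30 is flagged "special"
```

Treating the routine month-to-month noise as if each wiggle had a findable cause (or averaging away a genuine signal like the 30) is exactly the mistake of confusing one type of variation with the other.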
About The Author

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended, as an approach to leadership, in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach, as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.