Davis Balestracci

Six Sigma

The Sobering Reality of ‘Beginner’s Mind’

‘It only has to average 100%’

Published: Thursday, June 28, 2012 - 09:44

I am in the midst of teaching an online MBA course in statistical thinking. This is actually my second go-round, and I've heavily revised my inherited materials, which were well-meaning but had some obvious gaps.


I insisted on using Brian Joiner’s Fourth Generation Management (McGraw-Hill) as the key text, and I still use Donald Wheeler’s excellent classic, Understanding Variation (SPC Press, 1993), and W. Edwards Deming’s own The New Economics (MIT Press, 2000 reprint), probably his most readable book; those last two were the main texts as the course was originally taught.

A key element of the course is weekly online dialogue among the students via two challenging discussion questions. The very first week, I base one dialogue question on Thomas Nolan and Lloyd Provost’s seminal paper, “Understanding Variation” (Quality Progress, May 1990). If you’ve never read it, and education is part of your job, read it. It is an eye-opener, for students and for me, to realize that even though this paper was written more than 20 years ago, these basic concepts so crucial to 21st-century management aren’t even close to being mainstream.

It’s been quite the challenge for me to remember “beginner’s mind.”

Here's an example. One aspect of the course that needed major revision was how to teach the Individuals control chart (aka an I-Chart, or Wheeler’s “Process Behavior Chart”). It’s been four weeks now, and I’ve explained the I-Chart in many, many different ways using many scenarios and contexts. Aside from the math anxiety, which is considerable, even my best, most intuitive students still struggle with basic interpretation issues. Try as I might, it is very difficult to get them beyond the “one point beyond three standard deviations” test and the tendency to treat every triggered special-cause signal as, well, a special cause needing its own unique explanation.

I want to make it clear that I am not blaming the students. It makes me think of the thousands of “tools” seminars being taught, which always seem to end with the dreaded half-day of control charts (usually all seven of them).

The percent-computer-information-system uptime monthly meeting

Here is the scenario I give them for the week-three dialogue (based on a true story). I use it to teach the math of constructing a control chart and then, more important, to have them discuss what the chart means and what action they should take:

A medical center’s Harvard MBA COO insisted on nothing less than 100-percent computer uptime. No excuses.

Here are the last 19 months of performance:

[Figure: the last 19 months of monthly computer-uptime data]

In fact, he had rewarded folks for the “good work” they attained in June 1998 by saying, “Send out for pizza, and send me the bill!”

But, alas, you know what happens when you reward people—as evidenced by the ensuing four months of an alleged “disturbing” trend. So, no more Mr. Nice Guy. He now presides over a monthly meeting to discuss, in detail, the reasons for downtime that month. It’s obviously working because the month after the alleged trend, they got 100-percent uptime again. No pizza this time. And once again, it reinforces that “getting tough with accountability” is true leadership.

So now you have yet another dreaded monthly “account for results” meeting on your schedule, for which you can almost write the script of the litany of excuses. Usually it starts off with a display similar to the top two graphs of the following figure:

[Figure: typical month-to-month comparison graphs (top) and a run chart of the 19 months of uptime data (bottom)]

Then there are the conversations like, “Yes, we went from 99.7 percent to 98.6 percent, but there’s a good reason. You have to understand, we didn’t expect (fill in the blank) to happen. But here’s my plan for making it better next month.”

So, I present the class with a run chart of the data (bottom graph, above) and ask:

1. Is that period of June 1998–October 1998 a statistical trend?
No. It is only four successive decreases, and five are needed (with < 20 data points) to be called a trend.

2. Do you see a cluster of eight consecutive points either all above the median or all below the median?
One would hope that improvement was occurring during this time. If so, one might see eight recent points above the median, or eight points early in the data below the median. In this case, no.

3. Based on 1 and 2, so far, is the process behavior common or special cause? What strategy have they been using during the last 19 months?
So far, the behavior is common cause, and the monthly meeting to discuss, specifically, that month’s downtime is a special cause strategy. As Dr. Phil would say, “How’s that workin’ for ya?” There is no evidence of improvement—but do you think some complexity might have been added?

4. Obtain the I-Chart for these data. Based on the information so far, how should they proceed?
That’s a whole other article.
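For readers who want to try these checks on their own data, here is a minimal sketch in Python. The uptime numbers below are made-up stand-ins (the actual 19 monthly values appear only in the figure), so treat them purely as placeholders. The sketch applies the two run-chart rules from questions 1 and 2 and then computes I-Chart (process behavior chart) limits using the conventional average-moving-range formula, mean ± 2.66 × average moving range.

# Run-chart rules and I-Chart limits: a minimal sketch.
# NOTE: these uptime values are hypothetical placeholders, not the
# actual 19 months of data shown in the article's figure.
uptime = [99.3, 98.8, 99.6, 99.1, 99.8, 99.5, 99.9, 99.2, 99.7, 99.4,
          100.0, 99.7, 99.5, 99.0, 98.6, 100.0, 99.3, 99.6, 99.1]

# Rule 1: a "trend" -- find the longest run of successive decreases
# (the article uses 5 or more, with fewer than 20 data points).
longest_down = run = 0
for prev, curr in zip(uptime, uptime[1:]):
    run = run + 1 if curr < prev else 0
    longest_down = max(longest_down, run)

# Rule 2: a run of 8 or more consecutive points on one side of the median.
median = sorted(uptime)[len(uptime) // 2]    # simple median for odd n
longest_side = side_run = 0
last_side = None
for x in uptime:
    if x == median:
        continue                             # points exactly on the median are skipped
    side = "above" if x > median else "below"
    side_run = side_run + 1 if side == last_side else 1
    last_side = side
    longest_side = max(longest_side, side_run)

# I-Chart (process behavior chart) limits from the average moving range.
mean = sum(uptime) / len(uptime)
moving_ranges = [abs(b - a) for a, b in zip(uptime, uptime[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)
upper = mean + 2.66 * avg_mr
lower = mean - 2.66 * avg_mr

print(f"Longest run of decreases: {longest_down}")
print(f"Longest run on one side of the median: {longest_side}")
print(f"I-Chart limits: {lower:.2f} to {upper:.2f} (centerline {mean:.2f})")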

The concept of “specification” for service industries

There is a major point that is lost on many MBA students. Despite the excellence of Wheeler’s book, it is still somewhat manufacturing-based. His concept of “voice of the process” vs. “voice of the customer” (using manufacturing conformance-to-specification examples) is difficult for the students to grasp and, based on my observation of the weekly dialogues, almost totally lost on them. The concept does indeed apply, but in the following context for management and service industries. In week three’s lecture notes, I make this point:

“IMPORTANT: A lot of this week’s reading is about ‘specifications’ and based in a manufacturing mindset. This is going to be very difficult for most of you. I started my career in manufacturing, but I’ve also made the transition to ‘service’ (health care). For a service-industry management process, the best analogy I can make is in looking at someone’s ‘compliance’—using the term very loosely—to a work process: Did they get a desired result or not? If they did, they ‘met specification,’ and if they didn’t, their performance ‘didn’t meet specification.’ So, the deeper question is, ‘Is the fact that a process—in terms of overall organizational performance—doesn’t “go right” 100-percent of the time a common or special cause?’ That is, did it meet specification all the time? And how is each ‘noncompliance’ treated—as a common or special cause?”

I’d like to offer a graph for my health care readers to ponder. The health insurance industry is also “getting tough” (and might even be full of Harvard MBAs). Many have started a policy that, if certain things happen during a hospital stay that they feel shouldn’t have occurred (these are known as “never events”), the hospital will not get reimbursement for the additional expenses incurred to treat it.

So, here is a graph, by quarter, of “never” events for a fictitious hospital:

It’s “obvious” that they’re capable of getting zero events, so why can’t they just try a little harder and do it all the time?
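Since the hospital’s actual quarterly counts live only in the graph, here is a hedged illustration with made-up counts showing why an occasional zero does not mean the process can deliver zero on demand. Applying the same moving-range arithmetic as in the sketch above, a zero-event quarter can sit comfortably inside the common-cause range.

# Hypothetical quarterly "never event" counts (illustrative only; the
# article's actual data appear only in its graph).
events = [3, 1, 0, 4, 2, 5, 0, 3, 2, 1, 4, 0]

mean = sum(events) / len(events)
avg_mr = sum(abs(b - a) for a, b in zip(events, events[1:])) / (len(events) - 1)
lower = max(0.0, mean - 2.66 * avg_mr)       # counts cannot go below zero
upper = mean + 2.66 * avg_mr

print(f"Centerline {mean:.2f}, common-cause range {lower:.2f} to {upper:.2f}")
# With these made-up numbers the lower limit falls at zero, so a
# zero-event quarter is just common-cause variation, not proof the
# process is "capable" of zero every quarter.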

Still not hitting home with some of you? All right: Can you think back to the No Child Left Behind Act? What could be simpler? Are you as outraged as the breast-beating politicians who say, “This ‘shouldn’t’ happen in America!”? So, let me ask the government and education bureaucrats about their tough emphasis on accountability for this: “How’s it workin’ for ya?” (Readers: Have you heard just about all the excuses?) Ever heard the expression, “Your current processes are perfectly designed to get the results they’re already getting”... even if they ‘shouldn’t’?

Let me leave you with a very amusing true story. I had a friend whose manager insisted on 100-percent performance—not in a spirit of improvement. When my friend gave a reasonable explanation for why that wasn’t realistically possible, the manager said, “No, no, no, that’s OK. It only has to average 100 percent.” We still let people get six-figure salaries by being “tough” in reacting to the fact that one number is larger (or smaller) than another. And that outrages me.

As I try to make clear to my students, they are swimming in everyday opportunity.


About The Author


Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.