Davis Balestracci
Published: Friday, May 18, 2012 - 15:48

“I’m shocked... shocked to find that gambling is going on in here!”
—Casablanca’s Captain Renault, as he closes down Rick’s Cafe... while being handed his gambling winnings

I saw an abstract of a recent talk by several “experts” who have been very active selling (expensive) improvement initiatives during the last 5–10 years. They do this via lots of training, tools, “sharing best practices,” and exhortations such as, “If they can do it, you can do it.” Meanwhile, they are creating a massive subculture of qualicrats. I was amused when I read the abstract (reproduced below). Shocking!

More than 15 years ago, A. Blanton Godfrey, former chairman and CEO of the Juran Institute, wrote a column for Quality Digest about an earlier study by his colleague John Early, titled “Why It Takes So Long,” which investigated why several leading organizations’ quality improvement teams took so long to get results; many teams were taking more than a year to complete a project. Godfrey wrote, “Since then, we have reviewed several other companies and found the delays remarkably similar. The time spent on quality improvement projects breaks down into about one-third useful time and two-thirds wasted time. Management accounts for well over half of the excess time.” When his colleague did a Pareto diagram of the actual delays due to different causes, they found the breakdown listed below.

Godfrey’s solution? Blitz teams. Could his article have been written yesterday?

Many of you have learned, and maybe even teach, the Pareto chart. But how many of you apply the Pareto principle (also known as the 80/20 rule) to your potential work? It was a particular favorite of the 20th-century quality giant Joseph Juran. As my respected colleague Jay Arthur notes, “Use the 4-50 rule. In case after case working with various businesses, I have found that less than 4 percent of your business causes more than 50 percent of the waste, rework, cost, and lost profit.
So forget the old 80/20 rule. Narrow your focus even further to maximize your gains and minimize your Six Sigma startup costs. Only involve 4 percent of your staff in the initial wave of improvements focused on mission-critical elements of your business.” Arthur also warns, “Remember the dark side of the 4-50 rule—50 percent of your effort produces only 4 percent of the benefit.” Brian Joiner similarly warns, “Vague solutions to vague problems yield vague results.”

Research by the Juran Institute concluded that many projects fail for two major reasons, listed below under “Two traps.”

Regarding No. 1: Why is too much flowcharting a trap? Because you first need to apply the Pareto principle one step further to the chosen situation: What is the 20 percent of that chosen (i.e., vague) process causing 80 percent of the problem? Then do a detailed flowchart showing exactly how things get done, and by whom, in that specific area. This also considerably narrows the brainstorming if an Ishikawa cause-and-effect diagram is used as well.

So back up a bit and start with what Joiner and his fellow co-authors of The Team Handbook (Oriel Inc., 1988) call a top-down flowchart (aka process map): Break your situation or process into five to seven subprocesses and document only the specific tasks that occur at each stage, not how they get done. (If complete documentation is important for, say, a certification, the additional detail can be added after the project, once the major issue is solved.) The top-down flowchart can identify possible data collection points or traceability markers where a well-designed stratification will usually help isolate the “vital 20 percent.”

Regarding No. 2: As I’ve demonstrated time and again, an initial time plot (i.e., run chart) of a crucial indicator showing the extent of the problem should precede even the flowcharting, and it would be best if this indicator cascades upward to connect to a “big dot” in the boardroom.
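Arthur’s 4-50 rule is easy to sanity-check against your own cost-of-poor-quality numbers. Below is a minimal Python sketch; the `vital_few` helper, the cause names, and the dollar figures are all invented for illustration and are not from Arthur or this article:

```python
# Sketch of a "4-50 rule" check: find the smallest set of causes
# that accounts for a target share of total waste.
# Cause names and dollar figures are invented for illustration.

def vital_few(waste_by_cause, target=0.50):
    """Return the causes (largest first) needed to cover `target`
    of the total, plus the share they actually cover."""
    total = sum(waste_by_cause.values())
    chosen, running = [], 0.0
    for cause, cost in sorted(waste_by_cause.items(),
                              key=lambda kv: kv[1], reverse=True):
        chosen.append(cause)
        running += cost
        if running / total >= target:
            break
    return chosen, running / total

waste = {
    "wrong part shipped": 55_000,
    "rework on line 3": 20_000,
    "late invoices": 12_000,
    "mislabeled stock": 8_000,
    "everything else": 5_000,
}
causes, share = vital_few(waste)
print(f"{len(causes)} of {len(waste)} causes cover {share:.0%} of the waste")
```

With data this skewed, a single cause (20 percent of this short list, and often as little as 4 percent of a longer one) already covers more than half the waste.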
Most health care organizations have a tailor-made opportunity for this: safety. New results = new conversations, as Donald Berwick, M.D., describes so well in one of my favorite quotes (below).

How many of you spend a lot of time delivering internal education and training seminars? So tell me: How are they all contributing to a “big dot” in the boardroom? Do your execs still demand to know what the “payback” is for all this training, and threaten it with the first cutbacks when cost-cutting season (inevitably) arises? Well, then: Stop it all right now! Here’s a nice analogy from the wise Jim Clemmer. (Have you considered your efforts in this context?)

So, why not use the approach of connecting all improvement activity to organizational “big dots” as a way of piloting a new start, or a redesign, of your overall organizational improvement education process, with the major initial goal of improving your improvement process?

Initially, it would seem tempting to put a separate team on each aspect of the chosen organizational “big dot.” (In the case of health care safety, this would in one instance have resulted in separate teams addressing complaints, medication errors, falls, pressure sores, and bacteraemias: five teams in total.) Consider instead an initial one-team approach on the biggest-opportunity “little dot.” Some of the technical and improvement knowledge will no doubt spill over into the companion “little dots,” making any subsequent projects far more effective.

With this approach there’s a chance to teach run charts (construction and interpretation, common and special cause, baseline); flowcharting (top-down, detailed, deployment, value-added); data issues (operational definitions, design of a good data collection process); and stratification to further isolate the major sources of variation within your chosen topic, all within the context of solving a major organizational problem with a high probability of success.
Of course, there are then the solution implementation issues, what Juran recognized as “the remedial journey,” with its basis more in cultural psychology and sociology, about which the “experts” quoted above were clueless.

To me, this is far more interesting than a red bead experiment or flowcharting the process of making coffee. Or, as I once saw, a cause-and-effect diagram for improving the baking of chocolate chip cookies. As Juran liked to say, “There is no such thing as ‘improvement in general.’” Realize that improving your improvement process, to get better faster, is part of your charter as change agents.

A friendly warning via one of my 10 Commandments for Change Agents: “Quality may be very interesting to thee, but thee must realize that thy neighbor’s job takes up over 100 percent of his time.”

As you’ve heard me say many times, don’t teach people statistics. Teach them how to solve their problems and, more important, stop boring them to death! Developing a reputation for getting results will also get you that seemingly elusive respect you deserve.

Are you, like me, ready to say “Enough!” and do something about it? Then join me. I’d love to be able to say (once again, courtesy of Casablanca), “I think this is the beginning of a beautiful friendship.”
Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended, as an approach to leadership, in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach, as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.

Shocked? Not Really
Why we should solve the problems that matter
“Despite best intentions and commitment, why don’t improvement initiatives always yield hoped-for results? It’s much harder to spread successes in one setting to another even if, on the surface at least, the settings seem pretty similar. What can help this adaptation? How can problems and flaws in the design of the work be detected sooner? One underlying theme is the need to integrate a robust learning and evaluation system for every step of the way right from the start. [The presenters] will discuss recent situations in which ‘learning the hard way’ has given rise to much better and clearer-eyed processes for the future.”

Someone (me, too) already figured this out
• Nearly six weeks were lost due to team members being unable to dedicate sufficient time to the project.
• Senior executives added another five-and-a-half weeks of delay by failing to confront resistance to the changes implied by the solution, or even resistance to providing data, analysis time, and access to needed information.
• The lack of preexisting measurement criteria added another five weeks. Many teams were forced to develop the needed measures first and then collect the required data. Often, they then had to “sell” the measurement to the management team.
• On average, teams wasted another four-and-a-half weeks by starting out with a vague, debatable mission. They frequently had to rewrite the mission statement and meet with the senior management team again and again until they had a specific, measurable, acceptable mission.
• Many teams wandered off course rather than focusing on vital symptoms, causes, and solutions.

Solve problems that matter
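The delay figures just listed invite exactly the Pareto treatment Godfrey describes. A small Python sketch of that arithmetic; the weeks come from the Godfrey/Early figures above, while the shorthand cause labels are my paraphrases:

```python
# Pareto breakdown of the delay causes listed above
# (weeks of delay per cause; labels are shorthand paraphrases).
delays = [
    ("Members unable to dedicate time", 6.0),
    ("Executives not confronting resistance", 5.5),
    ("No preexisting measurement criteria", 5.0),
    ("Vague, debatable mission", 4.5),
]
total = sum(weeks for _, weeks in delays)
cum = 0.0
for cause, weeks in sorted(delays, key=lambda d: d[1], reverse=True):
    cum += weeks
    print(f"{cause:40s} {weeks:4.1f} wk  {100 * cum / total:5.1f}% cum")
```

The first two causes alone, both management-controlled, account for over half of the wasted time, which is Godfrey’s point about management owning most of the excess.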
Two traps
1. Too much detailed flowcharting, which can also produce a potential Ishikawa “cause-and-effect diagram from hell”
2. No good baseline estimate of the problem
“Several important things happen when you plot data over time. First, you have to ask what data to plot. In the exploration of the answer, you begin to clarify aims, and also to see the system from a wider viewpoint. Where are the data? What do they mean? To whom? Who should see them? Why? These are questions that integrate and clarify aims and systems all at once…. When important indicators are continuously monitored, it becomes easier and easier to study the effects of innovation in real time.”

Improve your improvement process
“Many organizations induce learned helplessness. It’s like the strange pumpkin I once saw at a county fair. It had been grown in a four-cornered Mason jar. The jar had since been broken and removed. The remaining pumpkin was shaped exactly like a small Mason jar. Beside it was a pumpkin from the same batch of seeds that was allowed to grow without constraints. It was about five times bigger. Organization structures and systems have the same effect on the people in them. They either limit or liberate their performance potential.”
© 2023 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
“Quality Digest” is a trademark owned by Quality Circle Institute, Inc.
Comments
Skip to Analyze, Improve and Control
Most companies already have too much data, so skip the Define and Measure phases of DMAIC and go right to Analyze, Improve, and Control. If you can't find data for the problem you want to solve, skip that problem and start with data about a problem you can solve. Too many teams try to define new measures and collect them, delaying problem solving and results.
There's more than enough data lying around about defects and deviation to drive improvement teams for years. Start there. Draw a control chart. Narrow your focus with a couple of Pareto charts or histograms. Determine the root causes and fix the 4 percent that's causing 50 percent of the mistakes, errors, defects, and lost profit. Repeat until you run out of things to fix, then start identifying better measurements.
With existing data, analysis of control charts and Pareto charts can be done in a day or two. Then gather a SWAT team of experts to do root cause analysis, project-manage the implementation of countermeasures, and measure results. Properly focused, improvements can happen in a day or two. Improperly focused, teams wander around for months with no results.
It's not hard.
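The control-chart step the comment describes really is a day-or-two calculation with existing data. A sketch of individuals (XmR) control-chart limits in Python; the mean ± 2.66 × average-moving-range formula is the standard one for individuals charts, while the monthly error counts and variable names are invented for illustration:

```python
# Sketch of individuals (XmR) control-chart limits computed from
# existing data, using the standard mean +/- 2.66 * average moving
# range formula. The monthly error counts are invented.

def xmr_limits(values):
    """Natural process limits for an individuals chart."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

errors = [7, 9, 6, 8, 10, 7, 9, 8, 6, 9, 20, 8]  # one suspicious month
lo, mean, hi = xmr_limits(errors)
outside = [x for x in errors if x < lo or x > hi]
print(f"limits: {lo:.1f} to {hi:.1f}; points outside: {outside}")
```

A point outside the limits is a special cause worth a root cause hunt; everything inside them is common-cause variation that tampering will only make worse.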