William A. Levinson

Health Care

Medical Tragedy Can Be Easily Prevented by Error Proofing

Ford’s “Can’t Rather Than Don’t” principle is applicable to any industry

Published: Tuesday, July 19, 2011 - 11:11

Dr. Gary Brandeland’s article, “The Day Joy Died,” which appeared in the Oct. 20, 2006, edition of Modern Medicine, underscores the primitive state of quality thinking—and more specifically, safety thinking—in hospitals. Although I’m not going to give formal engineering advice about medical devices I’ve not actually seen, it’s clear from the narrative that the patient’s death, along with the serious injury to her unborn child, could easily have been prevented by rudimentary quality and safety thinking. I refer specifically to Henry Ford’s “Can’t Rather Than Don’t” safety principle, which is also useful to any organization working with OHSAS 18001.

“Can’t Rather Than Don’t” refers to a kind of error proofing that prevents rather than warns. For example, instead of posting warning signs and instructions that tell workers, “Don’t put your hand in the mechanical press while it closes,” the press itself is designed so the worker can’t put his hand into it while it closes. Ford’s presses achieved this by requiring the operator to press and hold two switches, one per hand, to close the press. A two-worker press required its operators to hold down four switches, which made it impossible for any of the four hands involved to be anywhere near the danger area. Ford, and then Shigeo Shingo, proved that the same principle applies to quality through poka-yoke, or error proofing. The principle has been around for at least 90 years, and the ongoing failure of hospitals to implement it in life-critical applications—i.e., those that would rate a severity of 10 on a failure mode and effects analysis (FMEA) 1–10 ranking scale—is inexcusable.
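
As a minimal sketch of the interlock logic (a toy model, not any real press controller; the function and switch layout are hypothetical), the rule is simply that the ram may close only while every required palm switch is held:

```python
# Toy model of Ford's two-hand interlock: the press may close only while
# every required palm switch is held down, so no hand can reach the die area.
# Function name and switch layout are hypothetical, for illustration only.

def press_may_close(switches_held):
    """Return True only if every operator palm switch is currently pressed."""
    return all(switches_held)

# One-operator press: both hands must be on switches.
print(press_may_close([True, True]))   # True: stroke permitted
print(press_may_close([True, False]))  # False: a free hand blocks the stroke

# Two-operator press: all four hands must be accounted for.
print(press_may_close([True, True, True, True]))  # True
```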

Brandeland reports, “The patient was brain dead, the result of an anesthesia catastrophe. In preparing her for her C-section, the nurse anesthetist had accidentally intubated the esophagus and failed to put a pulse oximeter alarm on her.” The second sentence underscores the general principle that most catastrophic or fatal accidents require more than one thing to go wrong, which can be illustrated with a fault tree analysis (FTA). The FTA for this outcome would have an AND gate with four elements (see the sketch after the list):

1. Nurse anesthetist puts breathing tube in esophagus instead of windpipe, AND
2. Ventilator does not detect and announce the unusual condition via an alarm or similar means, AND
3. Pulse oximeter alarm is not connected, AND
4. Pulse oximeter does not announce its disconnected status (which should be obvious from a measurement of zero) via an alarm.
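
The power of the AND gate is quantitative: assuming the four elements are independent, the probability of the top event is the product of the element probabilities, so every working safeguard multiplies the risk downward. A minimal sketch, with purely illustrative numbers rather than clinical data:

```python
# AND-gate arithmetic for the fault tree above. With independent events,
# the top-event probability is the product of the element probabilities.
# All probability values below are illustrative assumptions, not real data.

def and_gate(probabilities):
    """Probability that every input event of an AND gate occurs."""
    result = 1.0
    for p in probabilities:
        result *= p
    return result

p_top = and_gate([
    0.01,  # 1. tube placed in esophagus (illustrative)
    0.10,  # 2. ventilator raises no alarm (illustrative)
    0.05,  # 3. oximeter alarm not connected (illustrative)
    0.10,  # 4. oximeter silent about disconnection (illustrative)
])
print(f"top-event probability: {p_top:.1e}")  # 5.0e-06
```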

The Healthcare Information and Management Systems Society (HIMSS) shows that off-the-shelf error proofing is, in fact, available for ventilators. It states explicitly: “Error: esophageal intubation (putting a tube into a patient’s stomach which was intended for their lungs). Error Proofing: Squeeze bulb and put on tube. If bulb inflates, the tube is in the lungs. If not, tube is incorrectly placed in the esophagus.”

In the Agency for Healthcare Research and Quality’s (AHRQ) Mistake-Proofing the Design of Health Care Processes, the agency discusses the bulb and adds the obvious complementary check: detection of carbon dioxide. Lungs exhale carbon dioxide while the stomach does not, so if the patient does not return carbon dioxide, the tube is in the wrong place. “The bag in Figure 5.18 has a detector that changes color in the presence of carbon dioxide,” states the report. “If the detector fails to change color, then the tube is in the esophagus. If it changes color, the tube is in the trachea. If the mistake-proofing device fails, the device will indicate a situation requiring corrective action.”
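
The decision logic behind these two devices is simple enough to sketch. The names and the combined-check policy below are illustrative assumptions, not a prescription from either source:

```python
# Hedged sketch of the detection logic described above: bulb inflation and
# colorimetric CO2 detection. Treating placement as confirmed only when both
# independent checks pass is an illustrative policy choice.

def tube_placement_ok(bulb_inflates, co2_color_change):
    """Return True only if both independent placement checks pass."""
    return bulb_inflates and co2_color_change

if not tube_placement_ok(bulb_inflates=True, co2_color_change=False):
    print("Suspect esophageal intubation: take corrective action before proceeding.")
```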

The report further clarifies this, stating, “Esophageal intubation is a common error that occurs when the intubation tube is inserted in the patient’s esophagus instead of in the trachea.” From an FMEA perspective, the failure not only has a maximum severity rating, but it also occurs frequently! Error proofing can make detection of this failure almost certain (D = 1 on a 1–10 FMEA scale) and therefore prevent nearly all fatalities, despite this failure mode’s high severity and probability. Failure to implement these technologies is therefore inexcusable.
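
A rough before-and-after comparison of the risk priority number (RPN = severity × occurrence × detection) on the 1–10 scales cited above makes the point concrete. The occurrence value here is an illustrative assumption, not published data:

```python
# Risk priority number comparison on 1-10 FMEA scales. Severity is 10 either
# way (fatal outcome); the occurrence value is an illustrative assumption
# standing in for "occurs frequently," not a published figure.

severity = 10    # fatal outcome: maximum severity
occurrence = 7   # "occurs frequently" (illustrative value)

rpn_without = severity * occurrence * 10  # D = 10: failure escapes detection
rpn_with = severity * occurrence * 1      # D = 1: detection almost certain

print(rpn_without, rpn_with)  # 700 vs. 70: same failure, far lower risk priority
```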

AHRQ’s report also cites medical error proofing that dates back to the 19th century; fatalities from administration of high-concentration heparin instead of low-concentration heparin come to mind immediately: “Bottles of poison are variously identified by their rectangular shape, blue-colored glass, or the addition of small spikes to make an impression on inattentive pharmacists.” The report includes many other examples of error proofing medical processes to make mistakes effectively impossible, and it is worth detailed study by hospitals and similar health care providers.


About The Author

William A. Levinson

William A. Levinson, P.E., FASQ, CQE, CMQ/OE, is the principal of Levinson Productivity Systems P.C. and the author of the book The Expanded and Annotated My Life and Work: Henry Ford’s Universal Code for World-Class Success (Productivity Press, 2013).

Comments

Your piece clearly outlines

Your piece clearly outlines an all-too-common practice and mindset of “don’t change anything until forced to by lawsuit or death.” I’ve been witness to this in industry as well. It leaves me with palms up and head shaking... why? Even tractors have built-in fail-safes. I cannot fathom why healthcare practitioners would deem them unnecessary. Sometimes, Bill, I wish your writing was just a little less clear and well thought out ’cause this one makes me feel pretty low.

Leadership

The quality problems in healthcare are not technical or scientific issues... they are leadership issues, namely a lack of serious leadership and effort around error proofing and quality improvement.

Healthcare still relies far too often on lecturing people to “be careful,” and individuals are punished when things go wrong.

We know how to fix these problems, but people just don’t do it.