
Gleb Tsipursky


How to Gather Quality Data to Inform Truly Effective Decision Making

Organizations of all sorts suffer from bad information-gathering processes when developing plans for major projects

Published: Wednesday, November 18, 2020 - 13:03

Does the phrase “garbage in—garbage out” (GIGO) ring a bell? That’s the idea that if you use flawed, low-quality information to inform your decisions and actions, you’ll end up with a rubbish outcome. Yet despite the popularity of the phrase, we see such bad outcomes informed by poor data all the time.

In one of the worst recent business disasters, two crashes of Boeing’s 737 Max airplane killed 346 people and led to Boeing losing more than $25 billion in market capitalization as well as more than $5 billion in direct revenue. We know from internal Boeing emails that many Boeing employees in production and testing knew about the quality problems with the design of the 737 Max; a number communicated these problems to the senior leadership.

However, as evidenced by the terrible outcome, the data collection and dissemination process at Boeing failed to take in such information effectively. The leadership instead relied on falsely optimistic evidence of the safety of the 737 Max in their rush to compete with the Airbus A320 model, which was increasingly outcompeting Boeing’s offerings.

Garbage in equals hundreds of lives lost and many billions of dollars.

Of course, other sectors experience rubbish outcomes from bad data all the time, not only business. Consider the U.S. Centers for Disease Control and Prevention's (CDC's) notorious failure to provide Covid testing kits early in the pandemic. During the initial stages of the pandemic, testing patients for Covid required sending samples to the CDC, which greatly slowed testing, isolation, and contact tracing. Then, on Feb. 5, 2020, the CDC manufactured and shipped 50,000 testing kits to hundreds of testing sites around the United States so they could test locally and get results much faster. However, the kits turned out to be poor quality, providing unreliable data. As a result, samples had to be sent to the CDC for the next three weeks, until the CDC replaced the bad kits with good ones.

The consequence? Covid spread widely undetected around the United States. Many more thousands of people caught the disease needlessly due to such bad data, with deadly consequences. Arguably, the subsequent lockdowns lasted longer and were more severe than they would have been with more timely and accurate testing, all of which resulted in many billions of dollars lost, millions of workers laid off, and thousands of small businesses going bankrupt.

Such case studies of GIGO are backed up by systematic research. For example, organizations of all sorts suffer from bad information-gathering processes when developing plans for major projects by failing to prepare sufficiently for risks and problems. As a result, a 2002 study of large construction projects found that 86 percent went over budget. In turn, a 2014 study of major IT projects found that only 16.2 percent succeeded in meeting the original planned resource expenditure. Of the 83.8 percent of projects that did not, the average IT project suffered from a cost overrun of 189 percent.

As an expert in the cognitive neuroscience and behavioral economics of risk management, decision making, and strategic planning, I’ve personally seen and studied research by others about our consistent failure patterns in gathering quality data to inform decisions, whether that’s for innovation, compliance, or other matters. For example, we tend to look for information that confirms our beliefs and ignore contradictory evidence; we don’t give due consideration to other people’s motives, feelings, and incentives for sharing or withholding information; and we feel too confident that our plans will go well, failing to guard sufficiently against risks and problems.

Scholars in cognitive neuroscience, psychology, and behavioral economics like myself call these mental blind spots “cognitive biases.” They result from a combination of our evolutionary background and specific structural features in how our brains are wired. These mental blind spots impact all areas of our life, from health to politics and even shopping.

In the upcoming Nov. 19, 2020, webinar, “How Quality Professionals Should Gather High-Quality Data to Inform Truly Effective Decision Making via Neuroscience,” I will discuss how to combat these biases in order to make fact-based decisions.

How cognitive biases undermine quality data collection

My experience shows that three cognitive biases bear the biggest responsibility for undermining quality data collection, leading to poor decisions and systematic risk management failures.

Planning fallacy
We need to watch out for the cognitive bias known as the planning fallacy, our intuitive belief that everything will go according to plan, whether in IT projects or in other areas of business and life. You’ve probably heard the business advice that “failing to plan is planning to fail.” That phrase is a misleading myth at best and actively dangerous at worst. Making plans is important, but our gut reaction is to gather information that points to best-case outcomes, ignoring the high likelihood that things will go wrong.

This tendency explains the failures of major construction and IT projects, and so many other issues. A much better phrase is “failing to plan for problems is planning to fail.” To address the very high likelihood that problems will crop up, you must plan for contingencies.

Here’s how a fallacy-based data-collection error led to a series of poor decisions. A major software company (that I won’t name) was blindsided by its biggest competitor putting out a new, advanced product. The company’s competitor was praised in the industry press as a cutting-edge leader and received positive online reviews on software evaluation websites. Some of the company’s customers started to consider switching to the competitor.

As a result, the company rushed to put out a new and even more advanced software product. The development team felt confident in its own ability, looked for information that supported its beliefs, and cut corners in checking quality. To meet a very ambitious launch date, company leadership pressured the quality control team to approve the product, despite protests of inadequate testing.

Unfortunately, the consequences proved predictable. The product launched with numerous bugs that left many customers frustrated and angry, resulting in many more switching to the competitor and bad industry write-ups and online reviews. The company suffered many millions in losses of direct revenue and much more in reputation and market share; the high-flying executive in charge of the new product was demoted to a dead-end position and eventually left the company.

Such competitive pressures—similar to Boeing’s desire to catch up to Airbus—often result in poor information-gathering that ignores evidence of problems. It’s when you feel like you really need to speed up that you should slow down and check thrice for data on risks and threats to avoid disastrous decisions.

Confirmation bias
A second, critically important mental blind spot is confirmation bias. Our brains tend to ignore or forget evidence that runs counter to our current perspective, and will even twist ambiguous data to support our viewpoint and confirm our existing beliefs. We also specifically look for information to support our current viewpoint, and value it much more highly than other data.

The stronger we feel about an issue, the stronger this tendency. At the extreme, confirmation bias turns into wishful thinking, when our beliefs stem from what we want to believe instead of what is true.

Confirmation bias reaches the very tops of organizations. A four-year study, which interviewed 1,087 board members from a diverse group of 286 organizations that had forced out their chief executive officers, found that almost one-quarter of CEOs—23 percent—were fired for failing to recognize negative facts about the organization’s performance. Other research shows denial of negative information happens at all levels within a company.

Economic downswings often reveal such denialism; they result in the removal of top leaders or the crashing down of fundamental assumptions within a company. Here’s an example. The production team for a high-tech equipment manufacturing startup strongly believed that its focus should always be on innovation to improve quality. They had good reason to believe that because the main decision makers for purchasing their products were production managers, who demanded the company’s high-quality, cutting-edge—and consequently comparatively expensive—products. The company’s marketing and sales centered on that reputation for quality, and they had a seemingly secure market niche.

However, the company’s leadership did not consider and gather information on what might happen in less flush economic times. The Covid pandemic led to a major recession, with decision-making in the company’s clients increasingly shifting from production managers to accountants. The CFOs and their staff at client companies looked at every element of their spending to ensure it was justified, pushing production managers to prove return on investment for anything above the lowest possible bidding price.

Yet the manufacturing company, in focusing on innovation and quality, failed to investigate the ROI from its products. It certainly had the capacity to gather data on the way its products were more durable and efficient than lower-cost, lower-quality competitors; however, it never chose to invest its resources to do so. The result: It lost significant market share to such competitors, as production managers, with apologies and regrets, chose to buy the inexpensive products.
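The ROI case the manufacturer failed to make can be sketched with a simple total-cost-of-ownership comparison. The numbers below are invented for illustration, not drawn from the case: a pricier, more durable product can still cost less per year than a cheap, short-lived one.

```python
# Illustrative total-cost-of-ownership comparison with made-up numbers.
def annualized_cost(price, lifetime_years, yearly_maintenance):
    """Simple per-year cost: purchase price spread over lifetime, plus upkeep."""
    return price / lifetime_years + yearly_maintenance

# Hypothetical premium (durable, efficient) vs. budget (short-lived) product:
premium = annualized_cost(price=50_000, lifetime_years=10, yearly_maintenance=1_000)
budget = annualized_cost(price=30_000, lifetime_years=4, yearly_maintenance=2_500)

print(f"Premium product: ${premium:,.0f}/year")  # $6,000/year
print(f"Budget product:  ${budget:,.0f}/year")   # $10,000/year
```

Gathering even this much data would have let production managers defend the premium purchase to their CFOs; the bias lay in never choosing to collect it.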

At first reluctant to acknowledge this issue, the company eventually launched a strategic pivot a few months into the pandemic. It initiated studies to prove its products were more than worth it in the long run; it also reinvested its product design and production capacities from innovation to designing ROI demonstration into the products themselves. Yet this endeavor, slated to take many months and millions of dollars, came too late to stave off many millions of dollars in lost revenue and serious deterioration of market share.

Empathy gap
The third biggest cognitive bias for gathering quality data is the empathy gap. We usually underestimate the extent and power of other people’s emotions, especially those we don’t perceive to be part of our in-group. It’s not surprising: In work contexts, we’re supposed to focus on logic and reason, right? Well, the reality is that emotions powerfully influence our decisions, with up to 95 percent of our decision making stemming from our emotions. Unfortunately, organizations fail to gather nearly enough information about the emotions of people involved in and influenced by decisions, whether internal stakeholders such as employees, or external ones such as clients.

Consider the case of a midsize pharmaceutical company that decided to update its strategic plan. The seven-member C-suite gathered for a two-day strategic retreat. They argued, debated, and finally coalesced around a five-year plan. Next, they conveyed it to their teams.

Unfortunately for them, the strategic plan got immediate push-back, both overtly from their team members, but much more covertly through the company grapevine. Many staff felt upset and frustrated that they weren’t consulted prior to the leadership team developing their plan.

The employees did make some logically valid points. Their feedback quickly proved some aspects of the plan to be nonviable, whether due to excessively ambitious timelines, shifting market conditions, or anticipated countermoves by competitors.

However, just as problematically, the employees felt their opinions didn’t matter, that the leaders simply saw them as cogs in the machine. Shortly after the unveiling of the strategic plan, surveys showed a sharp drop in morale and engagement, decreasing productivity, and growing attrition.

After four months of stubbornly trying to stick to their strategic plan, with minor revisions for the strongest logical arguments, the C-suite gave in. They threw their shiny, new strategic-planning folders into the dumpster and began a much more thorough, companywide information-gathering process to inform a revised strategic plan.

Failure-proof your information gathering to make and implement decisions effectively

Learning about these and related cognitive biases, and watching out for them, is one part of ensuring you gather top-quality data. You can also deploy an eight-step technique called “failure-proofing” that addresses the large majority of mental blind spots at once and maximizes the likelihood of making and implementing major decisions effectively. By using this eight-step process, you tap the most important practices of debiasing, the scientific field of defeating cognitive biases to make the best decisions.

Step 1: Gather
Gather all the people relevant for making the decision in the room, or representatives of the stakeholders if there are too many to have in a group. If the decision is serious enough, recruit an independent facilitator who is not part of the team to help guide the exercise.

Step 2: Explain
Explain the failure-proofing process to everyone by describing all the steps, so that all participants are on the same page about the exercise.

Step 3: Next best alternatives
Then, develop your default decision option, implementation plan, and two next-best alternatives (NBAs). Evaluate whether the NBAs seem more appealing than the default option, and consider whether to include any aspects of the NBAs into a revised plan.

Step 4: Imagine the decision failed
Next, ask all the stakeholders to imagine that they are in a future where the decision and its implementation definitely failed. Doing so gives permission to everyone, even the biggest supporters of the decision, to use their creativity in coming up with possible reasons for failure. Otherwise, their emotions will likely inhibit their ability to accept the possibility of failure due to a defensive emotional response that leaves their minds much less capable of creatively envisioning possible problems.

Have each participant anonymously write out plausible reasons for this disaster. Anonymity is especially important here, due to the potential for political danger in describing potential problems. The facilitator gathers everyone’s statements and then highlights the key themes brought out as reasons for failure, focusing especially on reasons that would not typically be brought up, and ensuring anonymity in the process.
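One way a facilitator might pull key themes out of the anonymous submissions is a simple keyword tally. This is a minimal sketch; the submissions and theme keywords below are invented for illustration, and a real exercise would group themes by judgment rather than by string matching.

```python
from collections import Counter

# Hypothetical anonymous failure reasons collected in step 4.
submissions = [
    "launch date too aggressive",
    "testing cut short to meet launch date",
    "key supplier may miss delivery",
    "team lacks experience with the new platform",
]

# Illustrative theme-to-keyword mapping chosen by the facilitator.
theme_keywords = {
    "schedule": ["date", "deadline", "aggressive"],
    "quality": ["testing", "defect", "bug"],
    "supply chain": ["supplier", "delivery"],
    "skills": ["experience", "training"],
}

# Count each submission at most once per theme.
themes = Counter()
for text in submissions:
    for theme, words in theme_keywords.items():
        if any(w in text.lower() for w in words):
            themes[theme] += 1

for theme, count in themes.most_common():
    print(f"{theme}: {count}")
```

Because submissions stay anonymous, the tally surfaces how many people independently flagged each theme without exposing who raised the politically risky ones.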

Step 5: Most serious problems
Discuss all the reasons brought up, paying particular attention to ones that are rude, impolitic, and dangerous to careers. Check for potential cognitive biases that might be influencing the assessments. Then, assess the probability and potential impact of each reason for failure.
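The probability-and-impact assessment in this step can be made concrete by ranking each failure reason by expected loss (probability times impact). The reasons, probabilities, and dollar figures below are hypothetical placeholders, not prescribed values.

```python
# Minimal probability-times-impact scoring of failure reasons (step 5).
# All reasons, probabilities, and impacts are hypothetical placeholders.
failure_reasons = [
    {"reason": "key engineer leaves mid-project", "probability": 0.25, "impact": 300_000},
    {"reason": "launch slips one quarter", "probability": 0.5, "impact": 160_000},
    {"reason": "regulatory approval delayed", "probability": 0.125, "impact": 800_000},
]

for r in failure_reasons:
    r["expected_loss"] = r["probability"] * r["impact"]

# Rank by expected loss so the team focuses on the most serious problems first.
for r in sorted(failure_reasons, key=lambda r: r["expected_loss"], reverse=True):
    print(f"{r['reason']}: expected loss ${r['expected_loss']:,.0f}")
```

Note how the ranking can be counterintuitive: the least likely event here tops the list because its impact is so large, which is exactly the kind of signal a gut-feel discussion tends to miss.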

Step 6: Fix problems
Decide on several failures that are most relevant to focus on, and brainstorm ways of solving these, including how to address potential mental blind spots. Also, discuss any evidence you might use that would serve as a red flag that the failure you are discussing is occurring or about to occur.

Step 7: Imagine success
You’ve addressed failure; now make sure you maximize success. Imagine that you are in a future where the decision succeeded far beyond what you expected. Have each participant anonymously write out plausible reasons for this success. Then, brainstorm ways of maximizing each of these reasons for success.

Step 8: Revise the decision
The leader revises the decision and implementation plan based on this feedback; if the revisions are serious enough, go through the failure-proofing process again with the new plan.


The first step to addressing GIGO involves recognizing that our minds suffer from cognitive biases that severely undermine our ability to gather quality information. Second, learn about these mental blind spots—most notably the planning fallacy, confirmation bias, and empathy gap, but also many others. Third, use evidence-based methods derived from neuroscience and behavioral economics, such as the failure-proofing technique, to defeat cognitive biases. From my experience consulting with hundreds of organizations of all sizes, as well as evaluating extensive scientific research, these three steps represent the best practice to empower you to gather top-quality data to inform truly effective decision making.

Register now for the Nov. 19, 2020, webinar, “How Quality Professionals Should Gather High-Quality Data to Inform Truly Effective Decision Making via Neuroscience,” where I will discuss how to combat biases in order to make more fact-based decisions.


About The Author

Gleb Tsipursky

Gleb Tsipursky is on a mission to protect quality leaders from dangerous judgment errors known as cognitive biases by developing the most effective decision-making strategies. A best-selling author, he wrote Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters (2019). His expertise comes from 20+ years of consulting, coaching, speaking, and training as the CEO of Disaster Avoidance Experts, and over 15 years in academia as a behavioral economist and cognitive neuroscientist. Contact him at Gleb[at]DisasterAvoidanceExperts[dot]com, on Twitter @gleb_tsipursky, Instagram @dr_gleb_tsipursky, and LinkedIn, and register for his Wise Decision Maker Course.