Derek Benson


Five Problems With Data in a Data-Intense World

Establish a sound data analytics process for true, data-driven decision making

Published: Tuesday, October 29, 2019 - 11:03

As a quality professional, you’ve probably heard the famous quote from W. Edwards Deming, “In God we trust; all others bring data.” Thanks to technological advancements in our industry, data exist more abundantly than ever. This presents a new challenge for those tasked with extracting and communicating useful knowledge from data.

Arguably, Deming's intent was for all others to bring knowledge, because data alone lack context and interpretation. Knowledge is the outcome of reliable exploratory practices applied to your data; when those practices are followed, insights and opportunities emerge. Problems with data can block access to this knowledge. Here are five common problems you may encounter in this data-intense world.

1. Information overload

Big data has been positioned as the way forward in business. Analyzing these data is supposed to help organizations by providing insights and facilitating better decision making, leading to more efficient operations, higher profits, and happier customers. The reality is that businesses are faced with an overwhelming amount of data, and although more data may seem like a good thing, much of it isn't usable. This can lead to wasted time digging through bad data just to get to the good. It can also lead to data overload, where organizations find their capacity for decision making reduced. Even worse, it can result in inaction, where no data-driven decisions are made at all.

2. Lack of confidence in the insights generated

With all these data available, businesses are focusing more than ever on data-driven decision making. However, research suggests that many decision makers don't trust the insights their data are supposed to reveal. A Massey University study showed that many top executives lack trust in their data and prefer to rely on intuition when making decisions. While it's reasonable to question data, it becomes debilitating when an organization completely loses faith in the quality of its data. In that case, decisions and actions are still being made, just without the assurances that quality data can provide. Once data distrust takes hold, it can significantly undermine all analytics efforts and remain a barrier until it is adequately addressed.

3. Applying incorrect analytical techniques

Another common problem you may encounter in a data-intense world is applying improper analytical techniques. The best method to use depends on the goal of the analysis, yet there is often a lack of in-depth knowledge about the various techniques and when each should be applied. If the analytical approach doesn't address what you are trying to accomplish, the results will be flawed, and the impact of those erroneous results can be serious: Incorrect analyses, false conclusions, and misleading results may form an artificial structure on which an organization's "data-driven" decisions are based. It is therefore essential to know whether an analysis method is suitable for its intended use.

4. Inability to distinguish types of variation in the data

All processes and data sets exhibit variation. Common cause and special cause are the two distinct types of variation defined in Deming's statistical thinking and methods. Common cause variation is the natural fluctuation present in any system. Special cause variation affects only some of the output of the process, is often intermittent and unpredictable, and is not part of the system. Confusing the two leads to one of two mistakes. The first is to assume that variation is special cause when it is actually common cause, which can result in wasted time looking for a problem that doesn't exist. The second is to assume that variation is common cause when it is actually special cause, which means missing the opportunity to find and correct a real problem.
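
To make the distinction concrete, here is a minimal sketch of how a point-beyond-the-limits signal is often detected on an individuals (X) chart, with process variation estimated from the average moving range. The function names and sample measurements are hypothetical, and only the simplest detection rule is shown; it is an illustration, not a substitute for proper SPC software or for choosing the right chart for your data.

```python
import statistics

def x_chart_limits(values):
    """Estimate individuals-chart limits from the average moving range (d2 = 1.128 for n = 2)."""
    center = statistics.mean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma_hat = statistics.mean(moving_ranges) / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

def flag_special_cause(values):
    """Return indices of points beyond the 3-sigma limits (the simplest detection rule)."""
    lcl, _, ucl = x_chart_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Hypothetical measurements: one shifted point stands out against the background noise.
measurements = [10.1, 10.3, 9.8, 10.0, 10.2, 9.9, 10.1, 12.6, 10.0, 10.2]
print(flag_special_cause(measurements))  # -> [7], a likely special-cause signal
```

Points flagged this way warrant investigation for a special cause; everything inside the limits is treated as common cause variation and is not worth chasing point by point.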

5. Ignoring factors related to time in your analysis

Another common issue with data analysis is believing that the order in which data were collected doesn’t matter. Ignoring time-based sequencing can be misleading. If you are using control charts to analyze your data, time-ordered data are a requirement for accurate control limits. Conclusions based on randomly ordered data can be skewed or incorrect. In manufacturing, especially, time is almost always a significant factor in data collection and should be treated as such.
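
As a hypothetical illustration of why collection order matters, the short snippet below computes the average moving range (the usual basis for individuals-chart control limits) for the same values in their original time order and after shuffling. The drifting process values and the seed are made up for the example; the point is only that destroying the order changes the estimate of variation, and therefore the control limits.

```python
import random
import statistics

def avg_moving_range(values):
    """Average absolute difference between consecutive points (basis for control limits)."""
    return statistics.mean(abs(b - a) for a, b in zip(values, values[1:]))

# Made-up example: a slowly drifting process, recorded in time order.
in_order = [10.0 + 0.05 * i for i in range(20)]

shuffled = in_order[:]
random.seed(1)            # fixed seed so the illustration is repeatable
random.shuffle(shuffled)

print(round(avg_moving_range(in_order), 3))  # small: successive points are close together
print(round(avg_moving_range(shuffled), 3))  # larger: order destroyed, limits inflated
```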

There are plenty of benefits to mastering the transformation of data into knowledge, especially as we experience the shift of Quality 4.0 from being merely an idea to becoming a reality. The sheer quantity of data available to quality teams is exponentially greater now. The opportunities for uncovering valuable insights in your data are endless; however, the pitfalls of data exploration are many. Establishing a sound data analytics process can set you on the path to true data-driven decision making.

To learn more, attend our webinar, “Being an Explorer in a Data-Intense World,” on Nov. 5, 2019, at 11 a.m. Pacific/2 p.m. Eastern.

 


About The Author


Derek Benson

Derek Benson is an application support manager for PQ Systems, developer of quality control software solutions. As a member of the PQ Systems training team, Benson provides both on-site and remote assistance to customers around the globe. He brings a background of technical experience to his support role and leads seminar and webinar training for both new and experienced software customers. Recent consultancy clients include Amsted Rail, Timet, Grote Industries, and Continental Structural Plastics. Benson holds a bachelor’s degree in business from Ashford University.

 

Comments

Bringing Data

I understand what you mean about bringing knowledge, but I took Deming's quote to mean bring data to support your assertion. This, in some or many cases, is a precursor to knowledge; perhaps an idea or hypothesis. The data provide context for the idea. As a loose example, if I think climate change isn't real, but I don't know (lack of knowledge), the data I might provide for discussion or assessment could be that temperatures are remaining constant and weather patterns have stabilized.

This is not meant to be an argument. I'm just sharing a thought with hopes of discussion or feedback. All that said, excellent article. And no, the data/knowledge question is not all I took away. I'm a fan of your work with Quality Digest in general.