Derek Benson
Published: Tuesday, October 29, 2019 - 12:03

As a quality professional, you’ve probably heard the famous quote often attributed to W. Edwards Deming: “In God we trust; all others bring data.” Thanks to technological advancements in our industry, data exist more abundantly than ever. This presents a new challenge for those tasked with extracting and communicating useful knowledge from data. Arguably, Deming’s intent was for all others to bring knowledge, because data alone lack context and interpretation. Knowledge, instead, is the outcome of reliable exploratory practices applied to your data, which, when followed, allow insights and opportunities to emerge. Problems with data can block access to this knowledge. Here are five common problems you may encounter in this data-intense world.

Big data has been positioned as the way forward in business. Analyzing these data is supposed to help organizations by providing insights and facilitating better decision making, leading to more efficient operations, higher profits, and happier customers. The reality is that businesses are faced with an overwhelming amount of data. Although more data may seem like a good thing, unfortunately, much of what is collected isn’t usable. This can lead to wasted time digging through bad data just to get to the good. It can also lead to data overload, in which organizations find they have a reduced capacity for decision making. Even worse, it can result in inaction, where no data-driven decisions are made at all.

With all these available data, businesses are focusing more than ever on data-driven decision making. However, research suggests that many decision makers don’t trust the insights their data are supposed to reveal. A Massey University study showed that many top executives lack trust in their data and prefer to rely on their intuition to make decisions.
While it’s reasonable to question data, it can be debilitating when an organization completely loses faith in the quality of its data. In that case, decisions and actions are still being made, just without the assurances that quality data can provide. Once data distrust takes hold, it can significantly undermine all analytics efforts and remain a barrier until it is adequately addressed.

Another common problem you may encounter in a data-intense world is applying improper analytical techniques. Among the variety of analytical methods, the best one to use depends on the goal of the analysis. There is often a lack of in-depth knowledge about the various techniques and when they should be applied. If the analytical approach doesn’t address what you are trying to accomplish, the results will be flawed, and the impact of these erroneous results can be serious. Incorrect analysis, conclusions, and false results may establish an artificial structure on which an organization’s “data-driven” decisions are based. Therefore, it is essential to know whether an analysis method is suitable for the intended use.

All processes and data sets exhibit variation. Common cause and special cause are the two distinct types of variation defined in Deming’s statistical thinking and methods. Common cause variation is the natural fluctuation present in any system. Special cause variation affects only some of the output of the process, is often intermittent and unpredictable, and is not part of the system. Confusing the two leads to two kinds of mistakes. The first is to assume that variation is special cause when it is actually common cause, which can result in wasted time looking for a problem that doesn’t exist. The second is to presume that an anomaly is common cause variation when it is actually special cause, missing the opportunity to find and correct a real problem.
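To make the distinction concrete, here is a minimal Python sketch of an individuals (XmR) control chart, the standard Shewhart-style tool for separating common cause from special cause variation. The data values are invented purely for illustration.

```python
def xmr_limits(data):
    """Individuals (XmR) chart limits from the average moving range."""
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    center = sum(data) / len(data)
    # 2.66 = 3/d2 for subgroups of size 2, the standard XmR constant
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

def special_cause_points(data):
    """Indices of points outside the natural process limits."""
    _, lcl, ucl = xmr_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

# A stable process with one injected spike at index 10
values = [10.2, 9.8, 10.1, 9.9, 10.3, 9.7, 10.0, 10.4,
          9.6, 10.1, 14.5, 10.0, 9.9]

print(special_cause_points(values))  # [10] -- only the spike is flagged
```

Points outside the limits signal special cause variation worth investigating; points inside reflect the system’s common cause variation and shouldn’t be chased individually.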
Another common issue with data analysis is believing that the order in which data were collected doesn’t matter. Ignoring time-based sequencing can be misleading. If you are using control charts to analyze your data, time-ordered data are a requirement for accurate control limits. Conclusions based on randomly ordered data can be skewed or incorrect. In manufacturing especially, time is almost always a significant factor in data collection and should be treated as such.

There are plenty of benefits to mastering the transformation of data into knowledge, especially as Quality 4.0 shifts from being merely an idea to becoming a reality. The sheer quantity of data available to quality teams is exponentially greater now. The opportunities for uncovering valuable insights in your data are endless; however, the pitfalls of data exploration are many. Establishing a sound data analytics process can set you on the path to true data-driven decision making. To learn more, attend our webinar, “Being an Explorer in a Data-Intense World,” on Nov. 5, 2019, at 11 a.m. Pacific/2 p.m. Eastern.

Derek Benson is an application support manager for PQ Systems, developer of quality control software solutions.
In his role as a member of the PQ Systems training team, Benson provides both on-site and remote assistance to customers around the globe. He brings a background of technical experience to his support role and leads seminar and webinar training for new and experienced software customers. Recent consultancy clients include Amsted Rail, Timet, Grote Industries, and Continental Structural Plastics. Benson holds a bachelor’s degree in business from Ashford University.

Five Problems With Data in a Data-Intense World
Establish a sound data analytics process for true data-driven decision making
1. Information overload
2. Lack of confidence in the insights generated
3. Applying incorrect analytical techniques
4. Inability to distinguish types of variation in the data
5. Ignoring factors related to time in your analysis
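Point 5 above can be demonstrated with a short Python sketch (illustrative numbers only). XmR control limits are derived from the average moving range between consecutive points, so scrambling the same measurements changes the limits and can hide a trend that the time-ordered data would reveal.

```python
def xmr_limits(data):
    """XmR (individuals) chart limits from the average moving range."""
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    center = sum(data) / len(data)
    # 2.66 = 3/d2 for subgroups of size 2, the standard XmR constant
    return center - 2.66 * mr_bar, center + 2.66 * mr_bar

# A steadily drifting process, in collection order
trend = [10.0, 10.2, 10.4, 10.6, 10.8, 11.0, 11.2, 11.4, 11.6, 11.8]
# Exactly the same measurements with the time order scrambled
shuffled = [11.4, 10.0, 11.8, 10.4, 11.2, 10.2, 11.6, 10.6, 11.0, 10.8]

print(xmr_limits(trend))     # narrow limits: consecutive changes are small
print(xmr_limits(shuffled))  # much wider limits: ordering information lost
```

Same values and same mean, but the scrambled series produces limits roughly five times wider, so a control chart built from it would miss the drift entirely.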
© 2023 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
“Quality Digest" is a trademark owned by Quality Circle Institute, Inc.
Comments
Bringing Data
I understand what you mean about bringing knowledge, but I took Deming's quote to mean bring data to support your assertion. This, in some or many cases, is a precursor to knowledge; perhaps an idea or hypothesis. The data provide context for the idea. Loose example: if I think climate change isn't real but I don't know (lack of knowledge), the data I might provide for discussion/assessment may be that temperatures are remaining constant and weather patterns have stabilized.
This is not meant to be an argument. I'm just sharing a thought with hopes of discussion or feedback. All that said, excellent article. And no, the data/knowledge question is not all I took away. I'm a fan of your work with Quality Digest in general.