

By: Donald J. Wheeler

The daily Covid-19 pandemic values tell us how things have changed since yesterday and give us the current totals, but they are difficult to interpret because they are only a small piece of the puzzle. This article will present a global perspective on the pandemic and show where the United States stands in relation to the rest of the world at the end of the third week in June.

Here we will consider 27 countries that are home to 5 billion people (67% of the world's population). According to the European CDC database, which is the source for all of the data reported here, these 27 countries had more than 75 percent of the world’s confirmed Covid-19 cases and 86 percent of the Covid deaths as of June 20, 2020. So they should provide a reasonable perspective on the worldwide pandemic. Figure 1 lists these countries by region and gives the relevant Covid-19 counts and rates as of June 20, 2020.


Figure 1: Countries used for global summary
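
For readers who want to build a similar summary from raw counts, here is a minimal Python sketch of the rate arithmetic involved. The country names and figures below are placeholders, not values from Figure 1 or the European CDC database.

# Convert raw Covid-19 counts into rates per 100,000 population
# so that countries of very different sizes can be compared.
# All numbers here are illustrative placeholders.

countries = {
    # name: (population, confirmed_cases, deaths)
    "Country A": (50_000_000, 120_000, 4_500),
    "Country B": (8_000_000, 30_000, 1_600),
}

for name, (pop, cases, deaths) in countries.items():
    case_rate = cases / pop * 100_000    # cases per 100,000 people
    death_rate = deaths / pop * 100_000  # deaths per 100,000 people
    print(f"{name}: {case_rate:.1f} cases, {death_rate:.1f} deaths per 100,000")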


By: Taran March @ Quality Digest

What is quality intelligence, exactly? It’s more than marketing spin. More, even, than the sum of its many control charts. It’s not collecting data simply to further go/no-go actions. And it doesn’t mean turning the cognitive wheel entirely over to artificial intelligence, either—far from it.

We might think of quality intelligence as a natural progression of quality control. It’s both granular, in that core quality tools underpin it, and forward-looking because quality data are used to improve not only products and processes but also operational performance. It’s very deliberate in that its goal is to wring the maximum value possible from reliable data.

To do this, quality intelligence employs four key practices: ensuring compliance, grading collected data, exploiting software, and implementing data strategically.

Ensuring compliance

People often assume that compliance applies solely to government or industry standards, but the term surfaces in many shop-floor conversations and processes. For instance, there is compliance to limits: Are data in specification? Are the appropriate statistical rules being met? There’s also compliance to procedures: Are people collecting data in the right way, and on time?
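
As a rough illustration of what an automated limit check can look like, here is a minimal Python sketch that tests data against specification limits and against natural process limits estimated from moving ranges. The limits and measurements are hypothetical.

# Check measurements for two kinds of compliance:
# (1) compliance to specification limits, and
# (2) a simple statistical rule: a point outside natural process
#     limits, with sigma estimated from the average moving range.
# Limits and data are hypothetical.

measurements = [10.2, 9.8, 10.1, 10.6, 9.9, 10.0, 11.4]
LSL, USL = 9.5, 10.5   # hypothetical specification limits

mrs = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
sigma_hat = (sum(mrs) / len(mrs)) / 1.128  # sigma from average moving range
center = sum(measurements) / len(measurements)

for i, x in enumerate(measurements):
    out_of_spec = not (LSL <= x <= USL)
    beyond_limits = abs(x - center) > 3 * sigma_hat
    if out_of_spec or beyond_limits:
        print(f"Sample {i}: {x} (out of spec: {out_of_spec}, "
              f"beyond natural limits: {beyond_limits})")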


By: Ryan E. Day

An organization can achieve great results when everyone is working together, looking at the same information generated from the same data, and using the same rules. Changes can be made that affect a company’s bottom line through operational improvements, product quality, and process optimization. There are quality intelligence (QI) solutions that can help reveal hidden opportunities.

Companies can save money and improve operational efficiency by effectively focusing resources on the problems that matter most from both a strategic and tactical perspective. A proper QI system makes this practical in several ways.

The QI advantage

With a QI system, data are captured and analyzed consistently in a central repository across the organization. This means there aren’t different interpretations of the truth, and there is alignment among those on the shop floor, site management, and corporate quality.

Alignment is possible because of a positive cascade of events:
• Notifications are sent to the appropriate people, and workflows trigger the required actions. This means people are appropriately accountable for addressing issues. Those issues can then be analyzed to understand recurring problems and how to avoid them, as sketched below.
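
Here is a minimal, hypothetical Python sketch of such a trigger. The line names, limits, and owner roles are illustrative and do not reflect any particular QI product's API.

# Hypothetical notification workflow: when a reading violates its
# limits, route an alert to the responsible role. In a real QI system
# this would hand off to a workflow engine; here we just print.

OWNERS = {"line_1": "shop-floor lead", "line_2": "site quality manager"}
LIMITS = {"line_1": (9.5, 10.5), "line_2": (4.8, 5.2)}

def check_and_notify(line, value):
    lo, hi = LIMITS[line]
    if not (lo <= value <= hi):
        print(f"ALERT to {OWNERS[line]}: {line} reading {value} "
              f"outside [{lo}, {hi}]")

check_and_notify("line_1", 10.8)  # triggers a notification
check_and_notify("line_2", 5.0)   # within limits, no action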


By: Dirk Dusharme @ Quality Digest

Blame it on Moore’s law. We live in a digital Pangaea, a world of borderless data driven by technology and by the speed and density with which data can be transmitted and handled. It’s a world in which data-driven decisions cause daily fluctuations in markets and supply chains. Data come at us so fast that there is almost no way business leaders can keep abreast of changing supply chains and customer preferences, let alone react to them.

Operating any kind of manufacturing today requires agility and the means to turn the flood of largely meaningless ones and zeros into something useful. The old ways of treating data as nothing more than digital paper won’t cut it in the “new normal.” We need to reimagine how we view quality.


By: Ryan E. Day

It’s no secret that manufacturing companies operate in an inherently unstable environment. Every operational weakness poses a risk to efficiency, quality, and ultimately, to profitability. All too often, it takes a crisis—like Covid-19 shutdowns—to reveal operational weaknesses that have been hampering an organization for a long time.

The nature of the problem

It is not just a manufacturing company’s production facility that faces operational challenges, either. The entire organization must address a host of risks and challenges: Shifting consumer and market trends necessitate improved agility and responsiveness; dynamic, global competition forces innovation not only in product development but also in service and delivery; and evolving sales channels, including online outlets, challenge established profit margins. And these challenges are not going away any time soon.

The real problem, however, lies not with the challenges themselves but with a company’s reluctance to see the operational weakness that makes it susceptible to a particular risk in the first place.


By: Eric Weisbrod

For nearly a century, statistical process control (SPC) has been the cornerstone of quality management and process control. But traditional SPC can’t keep up as the pace of manufacturing accelerates. Twenty-first century manufacturing lines produce multiple products and create thousands of data points in any given minute. Operations, quality, and Six Sigma teams are buried in an avalanche of data that they can’t possibly interpret.

Many organizations find that their teams are consumed by continually monitoring control charts and updating spreadsheets. They don’t have time to understand what all those data really mean, or how they can use them to drive meaningful action for their companies.

Even real-time data fall short when they’re siloed in different databases and accessible in only one location. The result is missed opportunities and wasted time as teams search for the details they need to achieve manufacturing optimization across the enterprise.

So how do you monitor what’s happening on the plant floor while it’s happening, without becoming so buried in data that agile analysis and response become impossible? And how do you scale your solution across multiple lines, shifts, and sites?
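
One way to approach the scale problem is to evaluate control-chart rules programmatically across every stream rather than watching individual charts. The following Python sketch is illustrative only: it uses simulated streams and a single three-sigma rule, where a real system would read from a central repository and apply a fuller rule set.

import random

# Apply a three-sigma check across many data streams at once,
# flagging only those whose latest point falls outside natural
# process limits. Streams are simulated for illustration.

random.seed(1)
streams = {f"line_{i}": [random.gauss(10, 0.3) for _ in range(50)]
           for i in range(1, 201)}   # 200 simulated process streams
streams["line_7"][-1] += 2.0         # inject one out-of-control point

for name, data in streams.items():
    mrs = [abs(b - a) for a, b in zip(data, data[1:])]
    sigma_hat = (sum(mrs) / len(mrs)) / 1.128  # sigma from moving ranges
    center = sum(data) / len(data)
    if abs(data[-1] - center) > 3 * sigma_hat:
        print(f"{name}: latest point {data[-1]:.2f} is outside "
              f"its natural process limits")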



By: Puerto Rico Manufacturing Extension

El-Com Systems Corp. is a wholly owned subsidiary of California-based El-COM Systems Solutions. The local company has operated in Caguas, Puerto Rico, since 2016 and is dedicated to manufacturing complex electromechanical subsystems and assemblies for the global aerospace and defense industries. The company has 62 employees, including operational and administrative personnel.

El-Com Systems was required to implement and certify its quality management system in accordance with the international AS9100D standard for the aerospace sector. The challenge was not only to achieve certification but to do so during a period of accelerated growth, which required hiring additional employees for new production lines. Puerto Rico Manufacturing Extension Inc. (PRiMEX), part of the MEP National Network, was recommended to provide support in this process.



By: Donald J. Wheeler

Setting the process aim is a key element in the short production runs that characterize the lean production of multiple products. Last month in part one we looked at how to use a target-centered XmR chart to reliably set the aim. This column will describe aim-setting plans that use the average of multiple measurements.

The necessity of process predictability

All effective aim-setting procedures will be built upon the notion of a process standard deviation. Some estimate of this process dispersion parameter will be used in determining the decision rules for adjusting, or not adjusting, the process aim. When a process is operated predictably, this idea of a single dispersion parameter makes sense.
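
To make the idea concrete, here is a minimal sketch of the arithmetic behind an aim-setting plan that averages n measurements: The standard deviation of an average of n values is sigma divided by the square root of n, so the decision limits around the target tighten as n grows. The target, sigma, and measurements below are hypothetical, and this is a simplified illustration rather than Wheeler's exact procedure.

import math

# Aim-setting with the average of n measurements: the average of n
# values has standard deviation sigma / sqrt(n), so the "leave it
# alone" interval around the target narrows as n grows.
# All values are hypothetical.

target = 10.0   # desired process aim
sigma = 0.30    # estimated process standard deviation
n = 4           # number of measurements averaged

measurements = [10.12, 9.95, 10.21, 10.08]
avg = sum(measurements) / n

half_width = 3 * sigma / math.sqrt(n)  # three-sigma limits for an average
if abs(avg - target) > half_width:
    print(f"Adjust aim: average {avg:.3f} outside target ± {half_width:.3f}")
else:
    print(f"Leave alone: average {avg:.3f} within target ± {half_width:.3f}")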


Figure 1: When statistics serve as estimates



By: Gleb Tsipursky

So many companies are shifting their employees to working from home to address the Covid-19 coronavirus pandemic. Yet they’re not considering the potential quality disasters that can occur as a result of this transition.

An example of this is what one of my coaching clients experienced more than a year before the pandemic hit. Myron is the risk and quality management executive in a medical services company with about 600 employees. He was one of the leaders tasked by his company’s senior management team with shifting the company’s employees to a work-from-home setup, due to rising rents on their office building.

Specifically, Myron led the team that managed the risk and quality issues associated with transitioning all 600 employees to telework, because of his previous experience helping small teams of three to six people in the company move to working from home. The much larger number of people, in far more diverse roles, proved to be a challenge. So did the short timeline for the project: only four weeks, the result of a failed negotiation with the landlord of the office building.



By: Jay Arthur—The KnowWare Man

Story update 5/6/2020: The charts and some data have been updated to reflect the data available on the date this article was published.

During the Covid-19 stay-at-home order in Colorado, I've become increasingly frustrated by Covid-19 charts. Most of what I see are cumulative column charts, which don't give any real insight into what's going on. Are we really flattening the curve?

So I decided to use the state's Covid-19 statistics for Colorado and Denver county, and see what I could learn using control charts. Control charts have been around for almost 100 years. They use formulas to calculate control limits that encompass roughly 99.7 percent of the data points from a predictable process. This makes it easy to monitor any process and detect process shifts and "out of control" conditions.
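
For readers who want to try this themselves, here is a minimal Python sketch of the XmR (individuals and moving range) limit calculation; the daily counts are made up, not actual Colorado data.

# XmR chart limits: the individuals chart uses the average moving
# range scaled by 2.66; the moving range chart uses 3.268.
# The daily counts below are illustrative only.

daily_cases = [120, 135, 128, 150, 142, 160, 155, 149, 170, 165]

mrs = [abs(b - a) for a, b in zip(daily_cases, daily_cases[1:])]
avg_x = sum(daily_cases) / len(daily_cases)
avg_mr = sum(mrs) / len(mrs)

ucl_x = avg_x + 2.66 * avg_mr  # upper natural process limit
lcl_x = avg_x - 2.66 * avg_mr  # lower limit (floored at 0 for counts)
ucl_mr = 3.268 * avg_mr        # upper limit for the moving ranges

print(f"X chart: center {avg_x:.1f}, limits [{max(lcl_x, 0):.1f}, {ucl_x:.1f}]")
print(f"mR chart: center {avg_mr:.1f}, upper limit {ucl_mr:.1f}")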


Source: https://covid19.colorado.gov/case-data

