The QA Pharm


Good Metrics Practice for Quality Management Reviews

Data talk—but are you listening?

Published: Monday, June 13, 2016 - 11:45

A quality management review of data with responsible company leadership is a current good manufacturing practice requirement. Quality management review procedures vary, but many organizations struggle to present data from across the quality management system in a meaningful and consistent manner when there are multiple contributors. Following are a few ways to organize your information more coherently.

Report an opportunity for improvement. Reporting the opportunity helps to keep the focus on where to improve. For instance, report that 10 percent of investigations were overdue, rather than that 90 percent were completed on time.

A decrease shows improvement. A downward trend means improvement toward zero problems. For example, following root cause analysis training, recurring deviations decreased from 20 percent to 5 percent in six months.

Compare against historical performance. Comparing current performance with a previous period helps to illustrate improvement. For example, you may point out that 95 percent of planned supplier audits were conducted in the current quarter, compared with 5 percent in the previous quarter.

Index metrics for relative comparisons. Indexing removes the effect of differing denominators and makes periods directly comparable. For example, there have been seven complaints per billion units manufactured year to date vs. 18 complaints per billion units for the same period the previous year.
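Indexing is simple arithmetic: divide the count by its denominator and scale to a common base. A minimal sketch in Python; the production volumes here are hypothetical, chosen only to reproduce the rates in the example above:

```python
def complaints_per_billion(complaints, units_manufactured):
    """Index a raw complaint count to a rate per billion units,
    so periods with different production volumes compare fairly."""
    return complaints * 1_000_000_000 / units_manufactured

# Hypothetical volumes that reproduce the article's indexed rates
ytd_rate = complaints_per_billion(21, 3_000_000_000)    # 7.0 per billion
prior_rate = complaints_per_billion(45, 2_500_000_000)  # 18.0 per billion
```

Scaling by the numerator first keeps the division exact for round figures and avoids intermediate rounding.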

Report absolute numbers for critical issues. Indexing should be avoided when the issue is critical or the numbers are low. For example, report that two batches were recalled, rather than that 0.2 percent of batches were recalled.

Note events with markers on the timeline. When data are reported vs. time, it’s helpful to note significant events that had an effect on these data. For example, indicate that the trend line for environmental monitoring excursions started to increase when building construction started.

Define an unacceptable trend. Trends should be defined for run chart performance data. For example, consider the statistical process control method of five consecutive movements in the same direction, or seven consecutive points on the same side of the average.
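These two run rules are straightforward to automate. A sketch, assuming the rules exactly as stated (five consecutive movements in one direction, or seven consecutive points on one side of the average):

```python
def unacceptable_trend(points, moves=5, side=7):
    """Flag a run chart under two common SPC-style run rules:
    `moves` consecutive movements in the same direction, or
    `side` consecutive points on the same side of the average."""
    avg = sum(points) / len(points)

    # Rule 1: consecutive movements in the same direction
    dir_run, last_sign = 0, 0
    for prev, cur in zip(points, points[1:]):
        sign = (cur > prev) - (cur < prev)      # +1 up, -1 down, 0 flat
        dir_run = dir_run + 1 if sign != 0 and sign == last_sign else (1 if sign != 0 else 0)
        last_sign = sign
        if dir_run >= moves:
            return True

    # Rule 2: consecutive points on the same side of the average
    side_run, last_side = 0, 0
    for p in points:
        s = (p > avg) - (p < avg)
        side_run = side_run + 1 if s != 0 and s == last_side else (1 if s != 0 else 0)
        last_side = s
        if side_run >= side:
            return True
    return False

rising = unacceptable_trend([4, 5, 6, 7, 8, 9, 3])     # five upward movements
stable = unacceptable_trend([1, 2, 1, 2, 1, 2, 1, 2])  # no qualifying run
```

Ties (a flat movement, or a point exactly on the average) reset the runs here; a site's SPC policy may treat them differently.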

Report a measure of variability with averages. When reporting averages, be certain that the data can legitimately be combined, and provide a measure of variability. For example, reporting a drop in the average number of deviations per batch from 25 (previous 10 batches) to 17 (last 10 batches) as an improvement is misleading when the spread of deviations per batch widened from a range of 23 to 28 to a range of 5 to 45.
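A helper that returns the average together with its range makes it harder to hide a widening spread. The per-batch deviation counts below are hypothetical, constructed only to match the averages and ranges in the example:

```python
def summarize(deviations):
    """Return (average, minimum, maximum) for a list of per-batch
    deviation counts; an average alone can mask growing variability."""
    return (sum(deviations) / len(deviations), min(deviations), max(deviations))

# Hypothetical per-batch deviation counts matching the example
previous_batches = [23, 24, 25, 26, 28, 23, 24, 26, 27, 24]  # mean 25, range 23-28
recent_batches = [5, 45, 8, 10, 12, 20, 25, 15, 18, 12]      # mean 17, range 5-45
```

Seen side by side, the lower recent average is clearly not the whole story: the process has become far less predictable.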

Chart scales must be sensitive to the intended purpose. The scale of a chart should be large enough to include all excursions within the time frame depicted, yet small enough to make the range of normal variation visible. For example, a chart scale of 0 percent to 100 percent for percent overdue nonconformance investigations is inappropriate for a 12-month performance chart with normal variation of 3 percent to 6 percent. A more appropriate scale would be 0 percent to 12 percent. If the same time frame included an excursion of 18 percent, then a chart scale of 0 percent to 20 percent would be appropriate.
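One way to pick such a scale programmatically is to round the worst observed value up to a tidy ceiling rather than defaulting to 0 to 100 percent. A sketch of that heuristic; the headroom and step values are illustrative choices, not from the article:

```python
import math

def chart_upper_limit(values, headroom=1.1, step=5):
    """Choose a y-axis maximum just above the worst excursion,
    rounded up to a multiple of `step`, instead of a fixed 100."""
    return math.ceil(max(values) * headroom / step) * step

# Normal variation of 3-6 percent -> a 0-10 scale, not 0-100
normal_scale = chart_upper_limit([3, 4, 6, 5, 3])
# An 18 percent excursion -> widen the scale to 0-20
excursion_scale = chart_upper_limit([3, 4, 6, 18, 5])
```

The article suggests 0 to 12 for the first case; this step size happens to land on 10, which serves the same purpose of keeping normal variation visible while containing every excursion.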

And always remember: data talk, opinions walk.


About The Author

The QA Pharm

The QA Pharm is a service of John Snyder & Co. Inc., a provider of consulting services to FDA-regulated companies. The firm helps clients build quality management systems, develop corrective actions that address regulatory compliance observations, and craft communication strategies to protect against enforcement action. John E. Snyder has worked at the lab bench, on the management board, and as an observer of the pharmaceutical industry for more than 30 years. His posts on The QA Pharm blog are straight talk about the challenges faced by company management and internal quality professionals. Snyder is the author of Murder for Diversion (Jacob Blake Pharma Mystery Series Book 1).