By: Peter J. Sherman

For the record, I’m a registered Independent voter. With that said… Recently, President Obama’s nominee for commerce secretary, Sen. Judd Gregg, unexpectedly withdrew from consideration. Was it because of skeletons in his closet: unreported nanny taxes, an inappropriate personal lifestyle, questionable business practices? No, the reason evidently had to do with a disagreement over who would control the Census Bureau—the Commerce Department or the Obama Administration. At the center of the debate, however, was how the census would be conducted: Would it use the traditional actual-count method or statistical sampling techniques?

Before explaining the differences between the two approaches, I need to share some background on the census. Every 10 years, our nation conducts the ritual of counting its population. The census does more than supply a figure for high-school textbooks: It determines how many seats each state gets in the House of Representatives and helps determine where district lines are drawn within each state. Literally billions of dollars are at stake, because these population-driven financing formulas determine where federal spending is allocated.
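The difference between an actual count and statistical sampling can be illustrated with a small simulation. The sketch below is purely hypothetical—the household data, city size, and 1-percent sample are invented for illustration: instead of enumerating every household, a random sample is scaled up to estimate the total, with a margin of error attached.

```python
import random
import statistics

random.seed(42)

# Hypothetical "true" population: people per household in a city of
# 100,000 households (unknown to the estimator in practice).
households = [random.choice([1, 1, 2, 2, 3, 4, 5]) for _ in range(100_000)]
true_total = sum(households)  # what an actual count would report

# Statistical sampling: enumerate only a random 1% of households,
# then scale the sample mean up to the full household count.
sample = random.sample(households, k=1_000)
estimate = statistics.mean(sample) * len(households)

# Standard error of the scaled estimate (simple random sampling,
# ignoring the finite-population correction for brevity).
se = statistics.stdev(sample) / len(sample) ** 0.5 * len(households)

print(f"actual count: {true_total}")
print(f"estimate:     {estimate:.0f} ± {1.96 * se:.0f} (95% CI)")
```

The sampling estimate lands close to the true total at a fraction of the enumeration cost—which is precisely why the method is attractive, and also why its error bars become politically contentious when House seats hang on the result.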

By: Steve Rogers

Today’s manufacturers must develop products quickly and inexpensively to meet the demands of a competitive marketplace. Rigorous testing to meet North American product certification requirements may prove to be a time-intensive process. If not properly planned, third-party approvals can inadvertently delay product launch plans and increase overhead costs.

In one common example, a third-party testing lab finds a problem with a product and must ship the faulty item back to the manufacturer for review and modification. Once corrected, the manufacturer ships the product to the lab again for additional testing. If another issue is found, the product returns to the manufacturer for yet another round of review and changes. This process continues until the lab approves a product that meets quality standards. With a certified in-house lab, the development team can bypass this back-and-forth process and make product adjustments without delay.

By: Kurt Boveington

The years have produced plenty of quality lingo: “right the first time,” “prevention vs. detection,” “total quality,” “Six Sigma,” “kaizen,” and “continuous improvement.” Intermec Media, a label converter in Fairfield, Ohio, has taken these concepts a step further and applied them to its own internal audit process. As with any process, it’s better to do it right the first time; ideally the process will prevent problems from recurring and allow itself to be continually improved so that it can adapt to an ever-changing business.

By: Thomas R. Cutler

Plan-do-check-act (PDCA) is an iterative four-step problem-solving process typically used in business process improvement. It’s also known as the Deming Cycle. When W. Edwards Deming postulated this process, there was no such system as e-commerce.

PDCA has rarely been applied to websites or to the quality of the enterprise resource planning (ERP) implementation that is often the backbone of an e-commerce website. This is particularly true in SAP ERP implementations. Sam Bayer, Ph.D., founder of b2b2dot0, suggests, “The e-commerce website provides the ultimate ‘check’ of the quality of the ERP configuration, business rules, and data. Real-time integrated e-commerce websites can provide a positive effect on overall order entry quality. If they are standalone websites that are separately maintained, they will inject defects into the process.”
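As a rough illustration only—the metric, the proposed change, and every number below are invented—the four PDCA steps can be sketched as a simple control loop: plan a change against a baseline, do it, check whether the metric improved, and act by keeping or rolling back the change.

```python
def pdca(metric, propose_change, apply_change, target, max_cycles=10):
    """Iterate plan-do-check-act until metric() meets target."""
    for cycle in range(max_cycles):
        baseline = metric()                 # where we stand today
        if baseline >= target:
            return cycle                    # goal reached
        change = propose_change(baseline)   # Plan
        apply_change(change)                # Do
        if metric() <= baseline:            # Check
            apply_change(-change)           # Act: roll back an unhelpful change
    return max_cycles

# Toy process state: order-entry accuracy as a fraction (invented numbers).
state = {"accuracy": 0.90}

def metric():
    return state["accuracy"]

def propose_change(baseline):
    return 0.02  # e.g., add one validation rule to the order-entry form

def apply_change(delta):
    state["accuracy"] = min(1.0, state["accuracy"] + delta)

cycles = pdca(metric, propose_change, apply_change, target=0.98)
print(f"reached the 98% target after {cycles} PDCA cycles")
```

The point of the sketch is the shape of the loop, not the numbers: in Bayer’s framing, the integrated e-commerce website is what supplies the “check” step for free, in real time.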

By: William A. Levinson

Carbon dioxide emissions are symptomatic of energy consumption in manufacturing, especially in transportation, so initiatives to reduce them often cut supply chain costs as well. However, an exaggerated focus on carbon emissions is dysfunctional and may overlook other cost-reduction opportunities.

Costs and benefits of greenhouse gas reduction

One of the most discussed environmental benefits of greenhouse gas reduction is mitigation of global warming. The costs include carbon dioxide sequestration, wind and solar generation techniques that cannot pass a managerial economic analysis on their own merits, and non-value-adding carbon credit trading programs. Serious questions must be asked as to whether marginal mitigation of rising sea levels, desertification, and so on is worth hundreds of billions of dollars.

By: Daniel M. Smith

Why would anyone start a new metrology business in this economic climate? Why would they do it in Michigan, the epicenter of the automotive industry recession? The short answer is that if you can identify a clear need in the marketplace for your product and have the ability and expertise to bring it to market, then the risk is minimal, manageable, and worth it. That was the thinking that led FixLogix, an Olivet-based CMM fixture manufacturer, to develop and market a new product in the middle of a recession.

Modular-fixture systems for coordinate measuring machines (CMM) were introduced in the late 1980s. Since then, there have been tremendous advances in CMM technology. Advances in computer systems and software led to the development of new machine designs. Software-based 3-D error compensation allowed for simpler, less expensive machine structures. Graphical user interfaces simplified software-based alignments, reducing the need to square the workpiece up to the machine axis. Meanwhile, modular fixture systems for CMMs remained relatively unchanged, based on the obsolete concept of building the holding fixture to gauge tolerances. As time went on and CMM prices declined, the cost of a modular fixture system became a much greater percentage of the total solution.
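To give a sense of how software-based error compensation works, here is a minimal, hypothetical sketch (the correction values are invented, and real CMM compensation uses far richer volumetric error maps than a single linear matrix): a small correction matrix absorbs a scale error and a squareness error in software, which is why the machine structure itself no longer needs to be geometrically perfect.

```python
def compensate(point, correction):
    """Apply a 3x3 linear correction matrix to a raw (x, y, z) reading."""
    x, y, z = point
    return tuple(
        correction[i][0] * x + correction[i][1] * y + correction[i][2] * z
        for i in range(3)
    )

# Invented example: the Y axis reads 0.01% long, and the X axis is
# skewed 50 microradians toward Y (a squareness error).
correction = [
    [1.0,     -50e-6, 0.0],   # subtract the squareness cross-talk from X
    [0.0, 1 / 1.0001, 0.0],   # shrink Y readings back to true scale
    [0.0,        0.0, 1.0],   # Z assumed error-free here
]

raw = (100.0, 200.020, 50.0)  # mm, as reported by the machine scales
print(compensate(raw, correction))
```

The same idea, extended to position-dependent error tables for each axis, is what let manufacturers replace costly hand-lapped guideways with cheaper structures plus a calibration run.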

By: Belinda Jones

As energy prices continue to soar and the public—with increased awareness of and concern for the environment—continues to demand environmental accountability from manufacturers, companies are looking long and hard at ways to decrease their environmental impact. But there are competing goals: on one hand, social and environmental accountability; on the other, the cost and quality issues associated with managing energy consumption.

By teaming with EnerNoc, a company headquartered in Boston, Massachusetts, that helps institutional and industrial organizations use energy more intelligently, one large equipment manufacturer is reaping the best of both worlds.

By: Tammi Cooper, Ph.D. and William H. Denney, Ph.D.

With the possible exception of leadership, more has likely been written about strategic planning than about any other management subject.

In studying leadership, we seek to learn the emotional characteristics that define a successful entrepreneur. What makes Bill Gates or Jack Welch succeed when so many CEOs struggle? We often begin with the cult of personality, but soon discover that organizational success is intricately tied to planning and execution.

So we look at strategic planning and too often find chaos. Studies have shown that up to nine out of 10 strategic plans fail. An analysis by Fortune magazine found that:

  • 60 percent of organizations don’t link strategy to budgeting

  • 75 percent don’t link employee incentives to strategy

  • 86 percent of business leaders spend less than one hour per month discussing strategy

  • 95 percent of workers don’t understand their organization’s strategy

But why do good intentions fall short?

By: Thomas R. Cutler

Respecting a quality manager’s opinion is meaningless unless there’s enterprisewide buy-in to his or her ideas and quality initiatives. Rarely do the individuals serving on a lean or continuous process-improvement team learn the scientifically proven communication techniques that will persuade others to change their perceptions about quality.

Often a lead quality control professional is perceived as a police officer rather than a trusted advisor within the team or an established guru in the field. Reminiscent of Rodney Dangerfield, more than 72 percent of quality professionals surveyed by my company recently reported they receive too little respect from others within their organization. The same survey revealed that all 392 North American quality managers surveyed wished to be seen by other managers as having a proactive and valuable opinion on how to make the company successful, and wanted to break down the walls that separate quality from other departments.

Driving buy-in from the organization for valuable quality initiatives—such as failure mode and effects analysis, good manufacturing practices, continuous improvement, ISO 9001, ISO 14001, the Baldrige Criteria, and governmental compliance or regulatory requirements—is much easier when quality professionals function effectively.

By: Frank Gray

In sports, it's always the fundamentals that your coaches emphasize, like the techniques that you first learn when you’re starting to play baseball—how to hold the ball properly, how to stand and hold a bat, or how to field a grounder. The basics about the sport, if performed perfectly, yield a positive outcome and add to the whole game experience. It's the same with calibrations. With all the new technology and advancements that come with calibration tracking and performance, we tend to overlook the basics, which in the long run, may lead to an unfavorable outcome during an audit.

Calibration documentation is a function of your quality unit. Engineering, the metrology team, a vendor, or production personnel may perform the calibration itself, but documentation review, verification, and follow-up are quality functions and should be treated as such. Whether you use upscale calibration software that complies with Food and Drug Administration (FDA) 21 CFR Part 11 or a paper documentation trail, your system may still lack the quality functions required to keep it compliant. Why is it that during an audit, the auditors always find a documentation or traceability issue in your calibrations? It’s those little issues that make it seem as if you have no control of your systems.
