Innovation Article

By: Celia Paulsen

Artificial intelligence (AI)-powered robots, 3D printing, the internet of things (IoT)... there’s a whole world of advanced manufacturing technology and innovation just waiting for small and medium-sized manufacturers (SMMs) that want to step up their digital game. Unfortunately, manufacturing digitization can present some fundamental challenges, like added cybersecurity risk.

So how do smaller manufacturers increase their advanced manufacturing technology capabilities while balancing the associated risks? Let’s dissect some of the top challenges for SMMs.

1. Cybersecurity plan

All technology implementations should begin with a plan that includes cybersecurity. A sound cybersecurity plan not only helps manufacturers identify and improve current security protocols, it also positions them to manage future risk.

Key stakeholders should identify the most critical information assets to protect, map how that information flows through the organization (currently and with any proposed technology or process changes), and determine the level of risk if that information were lost or compromised.
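As a minimal sketch of that exercise, the Python snippet below ranks information assets by a simple likelihood-times-impact score. The asset names, flows, and ratings are illustrative assumptions, not part of any NIST guidance; a real assessment would follow a framework such as the NIST Cybersecurity Framework.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str            # information asset to protect
    flows_through: list  # systems the information touches today
    likelihood: int      # chance of compromise, 1 (rare) to 5 (frequent)
    impact: int          # damage if lost or exposed, 1 (minor) to 5 (severe)

    @property
    def risk_score(self) -> int:
        # A common qualitative shortcut: risk = likelihood x impact.
        return self.likelihood * self.impact

# Hypothetical inventory for a small machine shop.
assets = [
    Asset("customer CAD files", ["email", "CAM workstation", "CNC"], 4, 5),
    Asset("employee payroll data", ["HR laptop", "cloud payroll app"], 2, 4),
    Asset("machine maintenance logs", ["tablet", "file server"], 3, 2),
]

# Review the highest-risk assets first when planning controls.
for a in sorted(assets, key=lambda a: a.risk_score, reverse=True):
    print(f"{a.risk_score:>2}  {a.name}  via {' -> '.join(a.flows_through)}")
```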

By: NIST

Scientists at the National Institute of Standards and Technology (NIST) and the Massachusetts Institute of Technology (MIT) have demonstrated a new way to make the switches inside a computer’s processing chips, one that could enable chips to use less energy and radiate less heat.

The team has developed a practical technique for controlling magnons, which are essentially waves that travel through magnetic materials and can carry information. Using magnons for information processing requires a switching mechanism that can control the transmission of a magnon signal through the device.

Although other labs have created systems that carry and control magnons, the team’s approach brings two important firsts: Its elements can be built on silicon rather than exotic and expensive substrates, as other approaches have demanded. It also operates efficiently at room temperature, rather than requiring refrigeration. For these and other reasons, this new approach might be more readily employed by computer manufacturers.

By: Michael Weinold

After nearly 130 years in business and a series of breakthrough innovations that shaped the way we light up our homes, General Electric has sold its lighting division to the U.S.-based market leader in smart homes, Savant, for a reported $250 million (£198 million). Although a licensing agreement means that consumers will continue to see GE-branded light bulbs in stores, the sale marks the end of an era for this quintessential giant of the illumination industry.

GE traces its roots to Thomas Edison’s invention of the first commercially practical incandescent light bulb in 1879. Since then, GE Lighting and its direct legal predecessors have shaped illumination technology like no other company: building on Edison’s legacy, the company went on to patent the tungsten filament in 1912 and the first practical fluorescent tubes in 1927.

By: Jeffrey Phillips

Throughout human history, we’ve constantly sought out tools and capital to make us more productive. From fashioning basic tools to assist in farming to deliberately cultivating and shaping the land for greater yields, humankind learned to grow food. Further research into genetics, fertilizers, and pesticides enabled us to rapidly scale food production. From early sweatshops to almost fully automated factories, we’ve learned how to scale manufacturing and get far more productivity from fewer workers and more machinery and automation.

In this manner, we’ve learned to improve the deployment of human labor, land, tools, machinery, and other capital to improve our quality of life. Now, we must fully engage the asset that we have the most of that is producing the least for us: data. It’s time to put our data to work.

By: Kayla Wiles

A new laser treatment method could potentially turn any metal surface into a rapid bacteria killer just by giving it a different texture, researchers say. In a new study, they demonstrated that this technique allows the surface of copper to immediately kill off superbugs such as MRSA.

“Copper has been used as an antimicrobial material for centuries,” says Rahim Rahimi, an assistant professor of materials engineering at Purdue University. “But it typically takes hours for native copper surfaces to kill off bacteria. We developed a one-step laser-texturing technique that effectively enhances the bacteria-killing properties of copper’s surface.”

A laser prepares to texture the surface of copper, enhancing its antimicrobial properties. (Credit: Kayla Wiles/Purdue)

The technique is not yet tailored to killing viruses such as the one responsible for the Covid-19 pandemic, which is much smaller than bacteria.

By: Taran March @ Quality Digest

What is quality intelligence, exactly? It’s more than marketing spin. More, even, than the sum of its many control charts. It’s not collecting data simply to further go/no-go actions. And it doesn’t mean turning the cognitive wheel entirely over to artificial intelligence, either—far from it.

We might think of quality intelligence as a natural progression of quality control. It’s both granular, in that core quality tools underpin it, and forward-looking because quality data are used to improve not only products and processes but also operational performance. It’s very deliberate in that its goal is to wring the maximum value possible from reliable data.

To do this, quality intelligence employs four key practices: ensuring compliance, grading collected data, exploiting software, and deploying data strategically.

Ensuring compliance

People often assume that compliance applies solely to government or industry standards, but the term surfaces in many shop-floor conversations and processes. For instance, there is compliance to limits: Are data in specification? Are the appropriate statistical rules being met? There’s also compliance to procedures: Are people collecting data in the right way, and on time?
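As a rough illustration of compliance to limits, the sketch below checks each measurement against specification limits and against one common statistical rule, a point more than three standard deviations from the mean. The readings, limits, and choice of rule are hypothetical, not from the article.

```python
import statistics

# Hypothetical part-diameter readings (mm) and specification limits.
readings = [10.02, 9.98, 10.05, 10.01, 9.97, 10.21, 10.00]
LSL, USL = 9.90, 10.10          # lower/upper specification limits

mean = statistics.mean(readings)
sigma = statistics.stdev(readings)

for i, x in enumerate(readings, start=1):
    in_spec = LSL <= x <= USL             # compliance to limits
    rule_ok = abs(x - mean) <= 3 * sigma  # one basic statistical rule
    if not (in_spec and rule_ok):
        print(f"sample {i}: {x} mm "
              f"{'out of spec' if not in_spec else 'violates 3-sigma rule'}")
```

Compliance to procedures is harder to automate, but the same idea applies: record who collected which sample and when, and flag any scheduled check that was missed.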

By: Ryan E. Day

An organization can achieve great results when everyone is working together, looking at the same information generated from the same data, and using the same rules. Changes can be made that affect a company’s bottom line through operational improvements, product quality, and process optimization. There are quality intelligence (QI) solutions that can help reveal hidden opportunities.

Companies can save money and improve operational efficiency by effectively focusing resources on the problems that matter most from both a strategic and tactical perspective. A proper QI system makes this practical in several ways.

The QI advantage

With a QI system, data are captured and analyzed consistently in a central repository across the organization. This means there aren’t different interpretations of the truth, and there is alignment among those on the shop floor, site management, and corporate quality.

Alignment is possible because of a positive cascade of events:
• Notifications are sent to the appropriate people, and workflows trigger the required actions. This means people are appropriately accountable for addressing issues. Those issues can then be analyzed to understand recurring problems and how to avoid them.
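A minimal sketch of that cascade might look like the following. The issue types, owner roles, and log structure are illustrative assumptions, not any particular QI product’s API.

```python
from collections import Counter

# Hypothetical routing table: which role owns which kind of issue.
OWNERS = {"out_of_spec": "line supervisor", "missed_check": "quality engineer"}

issue_log = []  # central repository shared by floor, site, and corporate

def raise_issue(kind: str, detail: str) -> None:
    owner = OWNERS.get(kind, "quality manager")
    issue_log.append((kind, detail))
    # In a real system this would trigger an email/SMS/dashboard workflow.
    print(f"notify {owner}: {kind} - {detail}")

raise_issue("out_of_spec", "line 3, diameter 10.21 mm")
raise_issue("missed_check", "line 1, hourly sample skipped")

# Recurrence analysis over the shared log reveals chronic problems.
print(Counter(kind for kind, _ in issue_log))
```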

By: Dirk Dusharme @ Quality Digest

Blame it on Moore’s law. We live in a digital Pangaea, a world of borderless data driven by technology, and the speed and density with which data can be transmitted and handled. It’s a world in which data-driven decisions cause daily fluctuations in markets and supply chains. Data come at us so fast that there is almost no way business leaders can keep abreast of changing supply chains and customer preferences, not to mention react to them.

Operating any kind of manufacturing business today requires agility and the means to turn the flood of largely meaningless ones and zeros into something useful. The old ways of treating data as nothing more than digital paper won’t cut it in the “new normal.” We need to reimagine how we view quality.

By: Ryan E. Day

It’s no secret that manufacturing companies operate in an inherently unstable environment. Every operational weakness poses a risk to efficiency, quality, and ultimately, to profitability. All too often, it takes a crisis—like Covid-19 shutdowns—to reveal operational weaknesses that have been hampering an organization for a long time.

The nature of the problem

It is not just a manufacturing company’s production facility that faces operational challenges, either. The entire organization must address a host of risks and challenges: Shifting consumer and market trends necessitate improving agility and responsiveness; dynamic and global competition forces innovation not only in product development but also in service and delivery; and evolving sales channels, including online outlets, challenge established profit margins. And these challenges are not going away any time soon.

The real problem, however, lies not with the challenges themselves but with a company’s reluctance to see the operational weakness that makes it susceptible to a particular risk in the first place.

By: Eric Weisbrod

For nearly a century, statistical process control (SPC) has been the cornerstone of quality management and process control. But traditional SPC can’t keep up as the pace of manufacturing accelerates. Twenty-first century manufacturing lines produce multiple products and create thousands of data points in any given minute. Operations, quality, and Six Sigma teams are buried in an avalanche of data that they can’t possibly interpret.

Many organizations find that their teams are consumed by continually monitoring control charts and updating spreadsheets. They don’t have time to try to understand what all that data really mean—or how they can use them to drive meaningful action for their companies.
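One way to offload that monitoring is to compute control limits from a known-stable baseline and surface only the exceptions. The sketch below is a simplified illustration: the data are hypothetical, and the standard deviation of baseline subgroup means stands in for the usual range-based estimate of X-bar chart limits.

```python
import statistics

# Hypothetical baseline of subgroup means collected while the process
# was known to be stable, plus a stream of new subgroup means to monitor.
baseline = [5.01, 4.99, 5.02, 5.00, 4.97, 5.01, 4.98, 5.03, 4.99, 5.00]
incoming = [5.02, 4.98, 5.14, 5.01]

center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # X-bar control limits

print(f"center={center:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}")

# Surface only the exceptions instead of eyeballing every chart.
for i, x in enumerate(incoming, start=1):
    if not (lcl <= x <= ucl):
        print(f"subgroup {i}: mean {x:.2f} outside control limits")
```

Scaled across lines and sites, the same principle holds: let software watch the limits continuously, and route only genuine exceptions to people.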

Even real-time data fall short when they’re siloed in different databases and accessible in only one location. The result is missed opportunities and wasted time as teams search for the details they need to achieve manufacturing optimization across the enterprise.

So how do you monitor what’s happening on the plant floor while it’s happening, without becoming so buried in data that agile analysis and response become impossible? And how do you scale your solution across multiple lines, shifts, and sites?
