By: Michael Pease

Digital transformation (DX) promises increased competitiveness, optimized processes, and profitability through big data, along with improved employee and customer relations. Gathering data is essential in today's data-oriented environment, and doing so requires flexible, interconnected components. Businesses will need people with the specialized skills to implement and optimize all of this. Beyond that, each firm will have to work within its unique DX plans and existing IT environment.

Legacy components can limit DX efforts

DX efforts are typically divided into three phases: digitization (transitioning from analog to digital data), digitalization (processing and analyzing digital data), and digital transformation (building on digitalization to optimize the business).

By: ISO

There's more than one path to service management, the term for all the activities, policies, and processes that organizations use to deploy, manage, and improve IT service provision. In today's technology-driven corporate landscape, the two leading methodologies come from the world of software development and information technology (IT).

Implementing a service management system in a structured way brings many benefits to an organization, such as greater efficiencies and improved customer relations. Organizations generally use a predefined framework of best practices and standard processes to provide a disciplined approach to service implementation. More recently, however, a new approach has taken the world by storm, putting a fresh spin on how to better develop and deliver software.

Enter Agile, a methodology that has given greater flexibility to the corporate world. Why is it so popular? Because it brings agility and creativity to the way we develop projects. It also dovetails neatly with more structured frameworks such as ISO/IEC 20000-11 for IT service management (ITSM) systems.

By: Renay San Miguel

Machine learning came along at just the right time. The world is now awash in more data than ever before, and computer algorithms that can learn and improve as they perform data analysis promise to help scientists handle that information overload.

Yet researchers who think that machine learning by itself can help solve complex problems in science, engineering, and medicine should strive for a more balanced approach, says Roman Grigoriev, part of a School of Physics team with new research suggesting a hybrid approach for conducting science that blends new-era technologies, old-school experimentation, and theoretical analysis. The research suggests faster solutions to complex, data-intensive riddles involving such issues as cancer, earthquakes, weather forecasts, and climate change.

By: Gleb Tsipursky

When a threat seems clear to you, it’s hard to believe others will deny it. Yet smart people deny serious risks surprisingly often.

A case in point comes from my experience in January 2021 helping a midsize regional insurance company conduct a strategic pivot to thrive in the post-Covid world. While conducting a pivoting audit, I observed the underwriting department failing to address serious long-term risks for a number of industries resulting from shifts in habits and norms due to the pandemic.

For example, many companies committed to having many employees work from home permanently, ranging from innovative tech companies like Dropbox to traditional ones such as the insurance giant Nationwide.

By: Silke von Gemmingen

Due to digitalization in Industry 4.0, internal logistics is subject to constant change. Internal traceability—i.e., tracking goods in the warehouse or production facility—increasingly plays a key role. Manufacturers and consumers are placing more emphasis on the safety and quality of products, so costly and image-damaging complaints must be avoided. Automation systems can help optimize goods control while at the same time facilitating and accelerating the operators' work, saving both time and cost.

An example of the successful implementation of a system for internal traceability in intralogistics can be found at Schnellecke Logistics. At the Dingolfing site in Germany, a scalable quality assurance solution from Pose Automation GmbH in Kleve ensures comprehensive photo documentation for incoming and outgoing goods inspection. The P.Portal used in a logistics hall takes over the analysis and documentation of the condition of the goods and uses bright USB3 vision industrial cameras from IDS.

By: Caroline Zimmerman, Theodoros Evgeniou

People often associate the term “data literacy” with mastering a litany of technical skills: SQL for data querying, Python for data analysis, and Tableau for data visualization, to name a few. However, one skill that is less discussed and has great power to scale data-guided decision making across the organization is far more basic, though not necessarily straightforward to learn: the ability to ask great questions of a data team.

Although this skill can be about setting big strategic directions, it is more often about defining narrower, possibly daily, requests for the team. But what, exactly, are good questions, and how does one go about skilling up an organization in asking them? At a recent INSEAD Tech Talk on enabling data-guided leadership and decision making, we discussed this and other questions with two players in the luxury retail space: Daragh Kelly, VP of data and analytics at Burberry, and Joyce Weng, MD of Bulgari UK. You can watch the full recording here or continue reading for the key takeaways from our discussion.

By: David L. Chandler

This story was originally published by MIT News.

As the world continues to warm, many arid regions that already have marginal conditions for agriculture will be increasingly under stress, potentially leading to severe food shortages. Now, researchers at MIT have come up with a promising process for protecting seeds from the stress of water shortage during their crucial germination phase, and even providing the plants with extra nutrition at the same time.

The process, undergoing continued tests in collaboration with researchers in Morocco, is simple and inexpensive, and could be widely deployed in arid regions, the researchers say. The findings were reported in the journal Nature Food, in a paper by MIT professor of civil and environmental engineering Benedetto Marelli, MIT doctoral student Augustine Zvinavashe, and eight others at MIT and at the King Mohammed VI Polytechnic University in Morocco.

By: Gary Lyng

To uncover the value in data, analysts need powerful combinations of tools to locate data wherever they are, regardless of whether they are structured or unstructured. Most companies don't realize that their current data-search approaches can't access distributed information and can't extract information from within unstructured documents. This severely limits their ability to translate data into profit.

How does big data search and retrieval work?

Traditionally, when you think of search-and-retrieval tools, you picture a search tool that lets you find data based on the parameters you define. This is certainly the case; however, it shouldn’t be the only thing your retrieval tools and process can do, or else your data management—and analysis—will be severely deficient.

Search tools are useful for finding specific files or data, but they won't reveal anything else unless you ask for it specifically. Although that sounds like a streamlined process, it excludes other data hidden across your distributed data infrastructure. What about those misnamed files that you'll never find?

By: Lawrence Livermore National Laboratory

A Lawrence Livermore National Laboratory (LLNL) scientist and collaborators have demonstrated the first-ever “defect microscope” that can track how populations of defects deep inside macroscopic materials move collectively.

The research, which appeared last month in Science Advances, shows a classic example of a dislocation (line defect) boundary, then demonstrates how these same defects move exotically just at the edge of melting temperatures.

“This work presents a large step forward for materials science, physics, and related fields, as it offers a unique new way to view the ‘intermediate scales’ that connect microscopic defects to the bulk properties they cause,” says Leora Dresselhaus-Marais, a former Lawrence fellow and now assistant professor of materials science and engineering at Stanford University.

By: Adrian Hernandez, C. Michael White

The U.S. Food and Drug Administration (FDA) regularly inspects manufacturing facilities to ensure that drugs meet rigorous quality standards. These standards are vital to protect patients from drugs that are incorrectly dosed, contaminated, or ineffective.

But over the past few years, tens of millions of doses of prescription and over-the-counter drugs have failed FDA quality expectations. This includes the ongoing 2018 recall of thousands of batches of popular blood pressure, diabetes, and acid reflux medications containing the probable carcinogen NDMA.

For perspective, the number of individual tablets and capsules for prescription and over-the-counter drugs entering the United States each year is counted in the trillions.