By: Donald J. Wheeler

With the use of statistical software, many individuals are being exposed to more than just measures of location and dispersion. In addition to the average and standard deviation, they often find some funny numbers labeled as skewness and kurtosis. Since these numbers appear automatically, it is natural to wonder how they might be used in practice. In part one of this two-part column, I'll illustrate what the skewness and kurtosis parameters do. In part two I will look at the use of skewness and kurtosis statistics provided by software packages.

Since the previous sentence makes a distinction between a statistic and a parameter, we should begin there. Statistics are merely functions of the data. We find the value for a statistic by performing a set of arithmetic operations using a set of data. For example, we compute the average for a set of numbers by adding up all the numbers and dividing by the number of values in the sum. Thus, any time we have a collection of numbers we can compute any one of a number of statistics. Data plus arithmetic equals a statistic.
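To make the "data plus arithmetic equals a statistic" idea concrete, here is a minimal Python sketch that computes the average along with moment-based skewness and kurtosis statistics for a set of data. The formulas shown are the simple (population) versions; statistical software packages often apply bias corrections, so their reported values may differ slightly.

```python
def moments(data):
    """Compute the average, standard deviation, skewness, and kurtosis
    of a data set using the simple moment-based (population) formulas."""
    n = len(data)
    mean = sum(data) / n
    # Central moments: average of the deviations raised to a power
    m2 = sum((x - mean) ** 2 for x in data) / n
    m3 = sum((x - mean) ** 3 for x in data) / n
    m4 = sum((x - mean) ** 4 for x in data) / n
    sd = m2 ** 0.5
    skewness = m3 / sd ** 3          # zero for perfectly symmetric data
    kurtosis = m4 / m2 ** 2          # equals 3 for a normal distribution
    return mean, sd, skewness, kurtosis
```

For example, the symmetric data set 1, 2, 3, 4, 5 has an average of 3, a skewness of exactly zero, and a kurtosis of 1.7, illustrating that these are nothing more than arithmetic performed on the data.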

By: MIT News

The migration of manufacturing from the United States to Asia could be having a significant impact on which advanced technologies are commercialized. Specifically, there is evidence that the shift in manufacturing is curtailing the development of emerging technologies in areas such as optoelectronics and advanced materials for the automotive industry.

By: Mike Micklewright

In a 2009 book review, a blogger named Khead quoted from Malcolm Gladwell’s book, The Tipping Point (Little, Brown and Co., 2000): “‘In order to create one contagious movement, you often have to create many small movements.’ This is one detail that explains his Rule of 150. The number 150 represents the maximum number of people that we are able to maintain a social relationship with… and that when a group, organization, or society begins to reach the number of 150, it is beneficial and necessary for a group to divide.”

Does this mean that, in the business world, in order to sustain a contagious movement (e.g., continuous improvement, innovation, stupendous customer service, and constant elimination of waste), an organization needs to subdivide itself every time it reaches about 150 employees, because otherwise people won’t know one another, or how they work and relate to each other? Yes, it does. Gladwell provides the example of a company that does exactly that. Gore Associates, a privately held, multimillion-dollar, high-tech firm based in Delaware, divides itself every time employment reaches about 150 in any one facility. This is a number it stumbled upon, just as so many other organizations and societies have in the past, without Gladwell’s influence.

By: Harry Hertz

Every few years the American Society for Quality (ASQ) conducts a Future of Quality study. The first phase of the 2011 study, which used a Delphi process to identify the key forces of change, was recently completed.

Using input from 150 panelists in 40 countries, the study identified and prioritized eight forces of change, as follows:

1. Global responsibility
2. Consumer awareness
3. Globalization
4. The increasing rate of change
5. The workforce of the future
6. Aging population
7. 21st-century quality
8. Innovation


ASQ asked me to think about how these forces will affect enterprise management and hence organizational quality. With ASQ’s permission I am sharing those thoughts with you as well as with the ASQ study team. I am dividing my comments into three groups of three: the first group covers the overarching factors I believe will be the big influencers, the next group addresses the impacts of those overarching factors, and the last group contains my (wild) speculations about possible outcomes. I propose we all meet in about 10 years and have a good laugh about these insights and predictions.

By: UC Davis Graduate School of Management

Frugal companies succeed commercially in part because they consistently control spending and are resourceful with people and products rather than cutting costs reactively, according to a new University of California, Davis, study. The paper, “Corporate Frugality: Theory, Measurement and Practice,” was co-authored by Anne M. Lillis of the University of Melbourne in Victoria, Australia. Forthcoming in the journal Contemporary Accounting Research, it explores frugality as a business culture rather than a reaction to recession.

“The research was motivated by all of the headlines that came out during the worst parts of the recession, indicating that firms were becoming frugal, as evidenced by layoffs and other cuts,” says Shannon W. Anderson, a professor at the UC Davis Graduate School of Management and co-author of the study. “My own experience working with companies on cost management made me very skeptical of the validity of characterizing these actions as evidence of frugality.”

The research confirmed that “today’s reactive, heavy-handed cost-cutting is the antithesis of true frugality,” and should not be mistaken for frugality, she said.

By: Microscan

Optical character recognition (OCR) is a vision system tool that is widely used in the packaging industry. Like bar code technology, OCR is a data-capture methodology. Its primary advantage is that it encodes information in a format that is both machine- and human-readable, while bar codes and 2-D symbols are only machine-readable.

OCR turns printed text characters from a digital image into a string of characters that can be decoded by the system, and then moved through subsequent steps in the production process as defined by the control software.

The simplest and most reliable method for optical character recognition relies on specific OCR fonts and templates that are designed for these applications (see figure 1). However, machine vision’s powerful functionality incorporates teachable OCR systems that can be “trained” to recognize characters in a user-defined font—a useful feature given the wide array of available printing technologies and the range of printed characters produced by them.

Figure 1: A crisp OCR font is the simplest and most reliable method for decoding.
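The template approach can be illustrated with a toy sketch: each character in a fixed OCR font is stored as a small binary bitmap, and an unknown glyph is decoded by choosing the template with the fewest mismatched pixels. The 3×5 “font” below is invented purely for illustration; real vision systems work on far larger images and complete character sets.

```python
# Toy template-based OCR: each character is a 3x5 binary bitmap,
# and a glyph is decoded by minimum pixel mismatch against the font.
# This tiny font is hypothetical, for illustration only.
TEMPLATES = {
    "0": ["111", "101", "101", "101", "111"],
    "1": ["010", "110", "010", "010", "111"],
    "7": ["111", "001", "010", "010", "010"],
}

def mismatch(glyph, template):
    """Count pixels where the glyph and template disagree."""
    return sum(g != t
               for grow, trow in zip(glyph, template)
               for g, t in zip(grow, trow))

def decode(glyph):
    """Return the character whose template best matches the glyph."""
    return min(TEMPLATES, key=lambda c: mismatch(glyph, TEMPLATES[c]))

# A slightly noisy "1" (one flipped pixel) still decodes correctly:
noisy_one = ["010", "110", "010", "011", "111"]
```

A “trained” OCR system extends the same idea: instead of shipping with a fixed font, it builds the template set from sample characters supplied by the user.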


(ISO: Geneva) -- A new international standard detailing the level of competency required by those responsible for verifying greenhouse gas (GHG) emissions has been published. It is the latest addition to the toolbox of standards from the International Organization for Standardization (ISO) for addressing climate change and supporting emissions trading schemes.

With a growing global awareness of the need for environmental protection and sustainability, organizations are eager to demonstrate their efforts to inventory, report, and reduce GHG emissions. In order to assure the credibility of their claims, many of these organizations are turning to third-party bodies to validate and verify emission assertions.

ISO 14066:2011—“Greenhouse gases—Competence requirements for greenhouse gas validation teams and verification teams,” spells out the competence requirements of the personnel undertaking the various validation or verification activities within the team appointed for the task. It is intended to achieve consistency in the global carbon market and maintain public confidence in GHG reporting and other communications.

By: MIT News

Power turbines may be a mature business, but they are also a booming one. This year General Electric received record orders for jet engines, and because natural gas is currently cheap, worldwide demand is increasing for gas turbines used in power plants, says Jeffrey Immelt, GE’s chairman and CEO.

To compete, the company is introducing new products based on innovations such as improved composites for fan blades and resilient alloys that allow for high-temperature, efficient operation. But at least as important from a competitive perspective are advances in the technology used to make turbines, which can lower costs and make new designs possible.

At GE’s global research headquarters in Niskayuna, New York, researchers are working on a new machining tool that uses a combination of a cutting disk and an electrical arc. The tool cuts through high-strength alloys three times as fast as the conventional alternatives, and it reduces energy consumption by 25 percent, bringing down manufacturing costs. Because it uses less force than conventional machining, the technology also makes it possible to conceive of new designs that might otherwise break during the process.


The electromagnetic force has gotten a little stronger, gravity a little weaker, and the size of the smallest “quantum” of energy is now known a little better. The National Institute of Standards and Technology (NIST) has posted the latest internationally recommended values of the fundamental constants of nature.

The constants, which range from the relatively famous (the speed of light) to the fairly obscure (the Wien frequency displacement law constant), are adjusted every four years in response to the latest scientific measurements and advances. These latest values arrive on the verge of a worldwide vote this fall on a plan to redefine the most basic units in the International System of Units (SI), such as the kilogram and ampere, exclusively in terms of the fundamental constants.

The values are determined by the Committee on Data for Science and Technology (CODATA) Task Group on Fundamental Constants, an international group that includes NIST members.


Terahertz radiation can penetrate numerous materials—plastic, clothing, paper, and some biological tissues—making it an attractive candidate for applications such as concealed-weapons detection, package inspection, and imaging skin tumors. However, to date there is no standard method for measuring the absolute output power of terahertz lasers, one source of this type of radiation. Now, researchers at the National Institute of Standards and Technology (NIST) have found that dense arrays of extra-long carbon nanotubes absorb nearly all light at long wavelengths, making them promising coatings for prototype detectors intended to measure terahertz laser power. The work is described in “Far infrared thermal detectors for radiometry using a carbon nanotube array,” by J. H. Lehman, B. Lee, and E. N. Grossman (Applied Optics, July 18, 2011).

The research is part of NIST’s effort to develop the first reference standards for calibrating lasers that operate in the terahertz range, from the far infrared at wavelengths of 100 micrometers to the edge of the microwave band at 1 millimeter.
