



Published: 09/30/2011
Now I don’t mean to brag, but I make a mean filet mignon... usually. The preparation always involves a good soaking in my secret marinade recipe (McCormick’s and red wine), followed by grilling on the BBQ turned up to its “ludicrous” setting. So why the occasional extra char in the char-broil?
Two reasons. One, because the tool I use to measure cooking time is none other than my own seat-o’-the-pants internal clock, which seems to have an intermittent calibration issue not covered by warranty; and two, I don’t use a thermometer at all.
If I were a professional chef, then a top-grade, instant-read, digital thermometer and the know-how to interpret the temperature readings would be imperative. As a weekend warrior, I should at least use a food thermometer and a decent cookbook. But I don’t. And who cares, other than my poor wife, who is then forced to forage for her evening’s nutrition?
This casual attitude does not work so well when applied to more serious metrological concerns. There are more important issues that deal with temperature measurements and their interpretation. One of these is climate change.
It’s obvious that the Earth has been through many drastic climatological changes throughout history (ice age/no ice age), and there’s no reason to believe that these cycles will ever end. Homo sapiens, however, are imbued with more curiosity than nine cats in a burlap sack, endless opinions, and an unexplainable compulsion to measure everything in the known universe. Measuring the Earth’s temperature is a natural target.
So what tools do we use to measure the Earth’s temperature, and more important, who takes the readings, who keeps the records, and who analyzes the data? Before we can form an informed opinion on the subject of climate change, we must vet the data first. Certainly no one would dispute that a forecast is only as good as the data used, and that any analysis of those data depends entirely on which data are included.
This brief but documented synopsis will allow us to understand where the main body of climate change information comes from and how it is handled on its way to being disseminated to the general public. It will also allow us to compare the methodology behind climate change research and private-sector temperature metrology and see how they stack up next to each other.
Stevenson screen
In 1890 the U.S. Congress established the U.S. Weather Bureau, which formed the Cooperative Observer Program (COOP). Scattered across the United States and staffed by volunteers, some 5,200 COOP stations are set up to record and report surface weather conditions to the Weather Bureau. COOP stations employ a Stevenson screen to gather raw data (figures 1 and 2). The screen is a tidy box protecting a cluster of instruments such as a thermometer and a barometer. The COOP program, still relying on Stevenson screens, remains a major source of information today.
Figure 1: Some COOP stations are set up like this.
Figure 2: Others operate in less-than-ideal locations that may skew overall trend charts, an issue that was studied by the National Climatic Data Center.
By 1997, the National Climatic Data Center (NCDC), part of the National Oceanic and Atmospheric Administration (NOAA), had some serious questions concerning the adequacy and deterioration of the COOP data-collection efforts. Some of these concerns were the result of its own studies, and some came, apparently, from NCDC’s investigation of complaints made by climatologists regarding instrument placement and data gathering. The NCDC addresses some of those questions on its frequently-asked-questions (FAQ) page.
Automated weather stations
A joint effort by the National Weather Service (NWS), the Federal Aviation Administration (FAA), and the Department of Defense (DOD) began in 1991 and eventually developed into the Automated Surface Observation System (ASOS). These state-of-the-art automated weather stations relay up-to-the-minute data around the clock, every day of the year, and add valuable surface-weather information to the COOP data set (figure 3). As of 2000, NOAA claimed 860 ASOS stations as commissioned.
Figure 3: An Automated Surface Observation System (image courtesy of NOAA)
Paleoclimatology
For climate information predating modern records, which are only available for about the last 150 years, scientists turn to paleoclimatology. Using a variety of sources such as tree rings, ice and Earth-core samples, coral reefs, and fossils, experts are able to form a picture of Earth’s climate history as far back as 5 million years.
Some organizations graph and report on climate trends back to 1990, others back to 1880, and still others go back 1,000, 2,000, even 1 million or 5 million years, depending on their particular viewpoint of what is relevant. That last bit is important. If you take the time to look at each of the links provided, you can see that the methodologies and data sets chosen do tend to align with the viewpoint of the author... or is it the other way around?
Late in 1999, several scientists working for the United Nations’ Intergovernmental Panel on Climate Change (IPCC) attempted to reconcile certain paleoclimatic anomalies as they prepared a report for the U.N.’s World Meteorological Organization (WMO). Their database was hacked in 2009, and internal e-mails revealed a struggle to manipulate data to correspond with the IPCC’s conclusions. The resulting “climategate” affair left an indelible stain on the subject of temperature measurement reports.
Satellites
The launch of NASA’s first weather satellite in 1960 gave scientists a whole new data set to play with. Between 1970 and 1976, NOAA’s logo graced five of what were deemed the improved satellites. NOAA has continued to improve its satellites’ capabilities to detect cloud, land, and ocean temperatures, as well as monitor the sun’s activities. Weather satellites are key tools in forecasting weather, analyzing climate, and monitoring weather hazards.
Figure 4: Temperature information via satellite (image courtesy of Gary Strand, National Center for Atmospheric Research)
We’ve looked at some of the data-gathering methods. Some seem fairly robust; some have been called into question. But all that data gathering isn’t worth anything unless the data are handled properly. The main players in the data-gathering and -storing arena seem to be the Goddard Institute for Space Studies (GISS), NOAA, and the University of East Anglia’s Climatic Research Unit (CRU) in Great Britain. NOAA boasts one of the longest-running data records anywhere with NCDC’s U.S. Historical Climatology Network (USHCN). The USHCN is a data set of monthly averaged maximum, minimum, and mean temperatures, plus total monthly precipitation. As NCDC describes on its FAQ page, raw data are adjusted to account for biases before they are entered into the USHCN database. For its part, as can be seen on its updates page, GISS adjusts which data are included in its calculations to reflect the institution’s ongoing changes in what it considers valid information.
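To make that structure concrete, here is a minimal sketch, in Python, of how monthly values like those in the USHCN could be derived from daily readings and then adjusted. The station values and the flat bias offset are fabricated for illustration only; they are not NCDC’s actual data, file format, or homogenization method.

```python
# Minimal sketch: deriving USHCN-style monthly values from daily readings.
# All readings and the flat "bias_adjustment" are invented for illustration;
# NCDC's real formats and adjustment procedures are more elaborate.
from statistics import mean

daily_max = [18.2, 19.0, 17.5, 20.1]   # daily maximum temps, deg C (fabricated; a few days stand in for a month)
daily_min = [7.1, 8.3, 6.9, 9.0]       # daily minimum temps, deg C (fabricated)
daily_precip = [0.0, 2.5, 0.0, 1.2]    # daily precipitation, mm (fabricated)

monthly_max = mean(daily_max)                   # monthly averaged maximum
monthly_min = mean(daily_min)                   # monthly averaged minimum
monthly_mean = (monthly_max + monthly_min) / 2  # monthly mean temperature
monthly_precip = sum(daily_precip)              # total monthly precipitation

# Raw values are adjusted for known biases (station moves, instrument changes,
# time of observation) before entering the database; one flat offset stands in
# for those adjustments here.
bias_adjustment = -0.3
adjusted_mean = monthly_mean + bias_adjustment

print(monthly_max, monthly_min, monthly_mean, monthly_precip, adjusted_mean)
```

The point of the sketch is simply that what reaches the published record is the adjusted value, not the raw one, which is why the adjustment step deserves scrutiny.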
It is here that many cynics begin to drool, and rightly so. Often the data have been adjusted, sometimes for reasons left unexplained. Sometimes the adjustments make sense, such as some of those explained on the NCDC FAQ page. At other times, not so much. And sometimes, as pointed out by meteorologists Joseph D’Aleo and Anthony Watts in their paper, “Surface Temperature Records: Policy-Driven Deception,” no explanation at all has been given for the data manipulation.
No problem; just grab the data and look at them yourself. In many cases that is doable, and I would encourage doing so. However, when Roger Pielke Jr., a science professor at the University of Colorado, pressed the CRU for raw surface temperature records, the university could not provide them, stating (paragraph 4): “Data storage availability in the 1980s meant that we were not able to keep the multiple sources for some sites, only the station series after adjustment for homogeneity issues. We, therefore, do not hold the original raw data but only the value-added (i.e., quality controlled and homogenized) data....”
To some, this might seem a minor thing. But imagine if your customer wanted you to supply the raw data you used as part of your quality assurance testing, and you had to tell them that it was lost.
As a good cookbook is to a cook, an accurate atmosphere-ocean general circulation model (AOGCM) is to those attempting to forecast climate change. AOGCMs are computer simulations built on parametrized input including, but not limited to, clouds, convection, carbon cycling, ice thickness, and ocean convection.
A model’s accuracy can be verified by simulating historical climate periods: feed in the raw data from a past time period and see whether the model produces output that matches known outcomes.
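In software terms, that hindcast check might look something like the sketch below. Both series are fabricated single numbers per year, and the tolerance is an arbitrary value chosen only for illustration; a real AOGCM validation works on gridded fields, many variables, and far more than one error statistic.

```python
# Minimal sketch of hindcast verification: run the model over a historical
# period and compare its output against the observed record. Both series are
# fabricated; a real AOGCM produces gridded fields, not one global number.
import math

observed = [14.1, 14.0, 14.3, 14.2, 14.5, 14.4]  # observed global means, deg C (fabricated)
modeled = [14.0, 14.2, 14.2, 14.4, 14.4, 14.6]   # model hindcast for the same years (fabricated)

rmse = math.sqrt(sum((o - m) ** 2 for o, m in zip(observed, modeled)) / len(observed))
bias = sum(m - o for o, m in zip(observed, modeled)) / len(observed)

# If the hindcast error exceeds the agreed tolerance, it is the model,
# not the measurements, that needs revisiting.
TOLERANCE = 0.25  # deg C, an arbitrary threshold for this illustration
print(f"RMSE = {rmse:.2f} C, bias = {bias:+.2f} C, acceptable = {rmse <= TOLERANCE}")
```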
By 2008 there were obvious discrepancies between the measured data and model predictions of the relationship between surface and tropospheric temperatures. Robert J. Allen and Steven C. Sherwood of the Department of Geology and Geophysics at Yale address these discrepancies and suggest changing measurement methods (i.e., measuring wind instead of temperature) rather than revising the models. Commenting on the discrepancies, NASA, whose satellites provide tropospheric temperature data, suggested instead, “A computer model is only as reliable as the physics that are built into the program. The physics that are currently in these computer programs are still insufficient to have much confidence in the predicted magnitude of global warming...”
So what does all of the above mean? Well, if you work in an industry that depends on accurate measurements, where your customers hold you accountable for the quality of the product they receive based on your own testing, then you might see some problems with how climate data are handled. If we are going to take global warming seriously, the data have to be treated seriously, and we have to have confidence in the scientists and institutions that are producing the data analyses.
Unfortunately, if we compare some of the methods employed in climate change research with those used in current private-sector metrology, the report card is nothing short of dismal.
To make such a comparison, let’s use environment chamber mapping (ECM) vs. climate change research (CCR) as examples. Unlike the Earth’s open atmosphere, chambers are meant to be controlled enclosures; however, the information required and the equipment used for data gathering are very similar. In any case, process methods share many quality control aspects regardless of the area of industry.
Data acquisition
ECM uses multiple sensors; so does CCR. If, however, an ECM company removed half of the sensors midtest and used data from the remaining sensors to fill a predetermined placement grid, as happened in CCR (according to D’Aleo and Watts), that company wouldn’t be in business very long. At a minimum, wouldn’t we want to know exactly why that occurred, and what potential impact such actions would have on the accuracy of our system, whether it’s an ECM or a CCR?
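A toy illustration of why that matters: when half the sensors are dropped and their grid positions are back-filled from the survivors, the reported average quietly drifts toward whatever the remaining sensors happen to read. The Python sketch below uses fabricated readings and a deliberately crude nearest-survivor fill; it is not how either ECM or CCR actually infills data, only a picture of the arithmetic.

```python
# Toy illustration: dropping half the sensors and filling their grid slots
# from a nearby surviving sensor shifts the reported average.
# All readings are fabricated; real infilling schemes are more elaborate.
readings = {  # grid position -> temperature, deg C (fabricated)
    "A1": 22.0, "A2": 22.4, "B1": 23.1, "B2": 23.5,
    "C1": 24.0, "C2": 24.6, "D1": 25.2, "D2": 25.8,
}

true_average = sum(readings.values()) / len(readings)

# Remove the warmer half of the grid, then re-fill those slots by copying
# the nearest remaining sensor (a crude stand-in for interpolation).
surviving = {k: v for k, v in readings.items() if k[0] in ("A", "B")}
infilled = dict(surviving)
for slot in ("C1", "C2", "D1", "D2"):
    infilled[slot] = surviving["B2"]  # nearest survivor in this toy layout

infilled_average = sum(infilled.values()) / len(infilled)
print(f"true mean {true_average:.2f} C vs infilled mean {infilled_average:.2f} C")
```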
Data storage
If an ECM company lost or destroyed the original data from a test, as happened in CCR, it would negate any further testing for that application. Again, I doubt clients would be too willing to give repeat business. It’s a fact of life that data sometimes do get lost or destroyed, but if you can’t reproduce a model, scenario, or data analysis because the data are no longer available, then those data must be left out of any larger calculations, perhaps set aside for study purposes only. It may be a setback in your analysis, but I don’t believe you can honestly keep making claims that can no longer be substantiated because the data are gone.
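One common private-sector safeguard is to archive the raw readings, untouched and checksummed, before any adjustment is made, so every later analysis can be re-run from the originals. The sketch below is a minimal illustration of that practice; the directory, file names, and readings are hypothetical, not any laboratory’s or agency’s actual system.

```python
# Minimal sketch of raw-data retention: archive the original readings with a
# checksum before any adjustment, so later analyses can be re-run from the
# untouched originals. Paths, file names, and readings are hypothetical.
import hashlib
import json
import pathlib

def archive_raw(readings, archive_dir="raw_archive"):
    """Write the raw readings to disk and return a checksum of the stored bytes."""
    path = pathlib.Path(archive_dir)
    path.mkdir(exist_ok=True)
    payload = json.dumps(readings).encode("utf-8")
    digest = hashlib.sha256(payload).hexdigest()
    (path / (digest + ".json")).write_bytes(payload)
    return digest

raw = [22.0, 22.4, 23.1, 23.5]      # fabricated raw sensor readings
checksum = archive_raw(raw)
adjusted = [r - 0.3 for r in raw]   # adjustments are made on a copy, never in place
print("raw data archived with checksum", checksum)
```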
Modeling
Reliable modeling is a cornerstone of chamber development. Without it, many resources are unnecessarily wasted. If our theoretical ECM company suggested changing the metrics to explain why its model did not coincide with newly observed data, its reputation would be diminished, to say the least. And if it were discovered that an ECM company’s employees were collaborating to manipulate model reports so that results matched previously known data (as happened in climategate), I venture to say they would be fired.
Although climate change and global warming reports seem endless and appear to come from every institution imaginable, they all share one thing in common: Large chunks of their information come from data that have already been adjusted by NOAA’s NCDC.
The reports referred to most often in the news and online come from:
• National Oceanic and Atmospheric Administration (NOAA)
• National Climatic Data Center (NCDC)
• United States Historical Climatology Network (USHCN)
• Global Historical Climatology Network (GHCN)
• Intergovernmental Panel on Climate Change (IPCC)
• World Meteorological Organization (WMO)
• Met Office Hadley Centre (HadCRUT)
• Goddard Institute for Space Studies (GISTEMP)
It may appear as if there are numerous independent sources of information, but many of the data crumbs lead back to NOAA again and again:
• The NCDC is listed as one of the services prepared by NOAA.
• The USHCN cites NOAA personnel as its principal investigators.
• The HadCRUT report cites NOAA, the WMO, and the GHCN among its station sources.
• The GISTEMP report lists the GHCN and USHCN as its first two sources for station data.
• Even the IPCC and the WMO use NCDC data, among others.
In other words, when you hear about a report from one of these organizations, you are really hearing about the same adjusted data over and over again. In many cases the raw data no longer exist, and yet these institutions continue to publish reports based on adjusted data. This appears to present a consensus among scientists, but if that consensus is in fact based on data served up by the multiple arms of one organization, then that organization’s reliability becomes essential to the research. We have to trust the data, how they were collected, and how they were analyzed.
The ramifications of climate change are too far-reaching to receive the short shrift of faulty processes. As professor Pielke puts it, “... because climate change is important and because there are opponents to action that will seize upon whatever they can to make their arguments, does not justify overlooking or defending this degree of scientific sloppiness and ineptitude. Implementing successful climate policy will have to overcome the missteps of the climate science community...”
Considering NCDC’s failing grades in data control and process management, it and its affiliated institutions may have corrupted the data and research badly enough to render the last 20-year data set worthless. Its handling of the data has, in many respects, hampered scientists from producing any kind of reliable long-range climate model.
And that’s no way to go about measuring anything.
Links:
[1] http://www.ncdc.noaa.gov/oa/hofn/
[2] http://www.ncdc.noaa.gov/cmb-faq/temperature-monitoring.html
[3] http://data.giss.nasa.gov/gistemp/graphs/
[4] http://earth.geology.yale.edu/alumni/Presents/Mann.pdf
[5] http://www.worldclimatereport.com/index.php/2008/02/11/a-2000-year-global-temperature-record/
[6] http://ossfoundation.us/projects/environment/global-warming/natural-cycle#section-4
[7] http://www.examiner.com/climate-change-in-national/climategate-climate-center-s-server-hacked-revealing-documents-and-emails
[9] http://data.giss.nasa.gov/gistemp/updates/
[10] http://scienceandpublicpolicy.org/images/stories/papers/originals/surface_temp.pdf
[11] http://sciencepolicy.colorado.edu/about_us/meet_us/roger_pielke/
[12] http://www.cru.uea.ac.uk/cru/data/availability/
[13] http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch8s8-6-2-2.html
[14] http://www.sciencedaily.com/releases/2008/05/080530144943.htm
[15] http://science.nasa.gov/science-news/science-at-nasa/1997/essd06oct97_1/
[16] http://www.climate.gov/#dataServices/noaaPartners
[17] http://cdiac.ornl.gov/epubs/ndp/ushcn/investigators.html
[18] http://www.cru.uea.ac.uk/cru/data/temperature/station-data/
[19] http://data.giss.nasa.gov/gistemp/references.html
[20] http://rogerpielkejr.blogspot.com/2009/08/we-lost-original-data.html