The biggest climatology scandal yet—the data is gone

(Note: be sure to see my exchange with Mr. Zarkov on the meaning of adjusted data versus original data in climatology.)

A. Zarkov writes:

As discussed in today’s Sunday Times, it seems that the East Anglia Climate Research Unit (CRU) has “lost” the original temperature data which were the basis of their climate history and climate projections. Without the original data, scientists will have great difficulty in checking the software and verifying that the adjustments to the data are correct. If they have thrown away the original paper records from the temperature stations, then it’s possible they can’t ever verify the adjustment calculations. Then it will come down to a “trust us” that everything was done correctly. I can tell you from personal experience that real scientists never throw away data. They’re like pack rats. If for some reason they wanted to get rid of the paper, then they should have microfilmed the paper records. Surely they must have had the original temperature data on magnetic tapes, or punched paper tape, or even on the old IBM punched cards. Why throw that away? There’s simply no excuse for such incompetence. The English are normally very good about keeping records such as birth registries for many hundreds of years. We must also face the possibility that the discard was deliberate so as to make it impossible to determine if the adjustments were bogus. [LA replies: I think the disappearance of data is so extraordinary that it doesn’t point just to a possibility that the disappearance was deliberate, but to a very high probability that it was deliberate.]

[LA correction: The original data were thrown away in the 1980s, when global warming was not the powerful ideological movement it is now. This makes it less likely, though still possible, that the data dump was deliberate. Either way, the fact remains that in the absence of the original data, it is impossible to determine whether the adjusted data is correct. Which means that the entire global warming mansion is a house of cards.]

The US Historical Climatology Network archives original as well as adjusted data from continental temperature stations going back to the beginning of the 20th century. This data is on-line and available for public access. Why can’t the CRU do the same thing? The adjustments are extremely important, as we can see from this graph, which plots the difference between the adjusted and the original temperature readings. Note that the whole of the alleged warming comes from the adjustments. I don’t understand why the adjustments are always positive and increasing after 1960. One would think that the “heat island effect” would have called for negative adjustments. The term “heat island” refers to the artificial elevation of the recorded temperatures as the area surrounding the stations becomes urbanized.
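
To make concrete what that graph plots, here is a minimal sketch of computing an adjusted-minus-raw difference series. The file and column names are hypothetical placeholders; the actual USHCN archives use their own fixed-width station formats.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical file and column names; the real USHCN archives
    # use their own formats and station identifiers.
    raw = pd.read_csv("ushcn_raw.csv", index_col="year")       # original readings
    adj = pd.read_csv("ushcn_adjusted.csv", index_col="year")  # adjusted readings

    # How much of the reported record is adjustment rather than measurement.
    diff = adj["temp_c"] - raw["temp_c"]

    # If the adjustments were neutral, this series would hover around zero;
    # a persistent, growing positive difference means the adjustments alone
    # contribute an apparent warming trend.
    print(diff.describe())
    diff.plot(title="Adjusted minus raw annual mean temperature (°C)")
    plt.show()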

The British press seems much more on top of this story than the U.S. press. In their zeal to protect and advance Obama’s agenda, the U.S. press keeps trying to downplay the significance of the leaked documents and emails. Normally the New York Times loves leaked documents when the leak threatens U.S. national security. They have no hesitation about printing stolen classified material such as the Pentagon Papers. If you really want to learn what’s happening with Climategate, you must go to the blogs. The MSM is virtually useless in the U.S.

- end of initial entry -

LA replies:

Can you explain what is meant by adjustments versus original data?

A. Zarkov writes:

In the mid and late 1980s the U.S. Weather Service changed the temperature sensors used in its Cooperative Station Network. Before the change, they used a max-min type thermometer similar to the one that you can buy in a hardware store. It’s a U-shaped tube with mercury that sloshes up and down, like this one. After the change they used an electronic temperature sensor in a completely different kind of housing. The meteorologists found a systematic difference between the two kinds of sensors, so they adjust the data to make all the records homogeneous. VFR readers who want to drill into the details should read this article. One might ask: which sensor is more accurate? As one can see from the linked article, the meteorologists don’t know, or didn’t know when the article was published. All this leaves me unsettled. I’m not sure what they have done is even correct. But in any case we know a number of adjustments are made to the raw sensor data to make all the records comparable. It’s a little like adjusting income data for inflation using the Consumer Price Index.
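
As a toy illustration of that kind of homogenization (not the meteorologists’ actual procedure, and with made-up numbers), suppose the two sensor types had been run side by side for a while and a systematic offset estimated from the overlap:

    import numpy as np

    # Hypothetical side-by-side readings (°C) during an overlap period.
    liquid_in_glass = np.array([14.2, 15.1, 13.8, 14.9, 15.3])
    electronic      = np.array([13.9, 14.7, 13.5, 14.6, 15.0])

    # Estimated systematic difference between the two sensor types.
    offset = np.mean(liquid_in_glass - electronic)  # 0.32 °C with these numbers

    # Homogenize: shift the electronic-era record onto the old scale.
    # Note the buried assumption: shifting one record toward the other
    # presumes we know which sensor is the accurate one.
    electronic_era = np.array([14.1, 14.4, 14.8, 15.0])
    homogenized = electronic_era + offset
    print(f"offset = {offset:.2f} °C")
    print(homogenized)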

LA writes:

In an unposted comment you mentioned another type of adjustment that is necessitated when a weather station moves to a new site. Could you explain this further?

A. Zarkov replies:

Moving to a new site means the geographical location of the weather station is changed. I would consider that establishing a new station, but that’s how they operate. Note the following from the Historical Climatology Network web page:

“The stations were chosen using a number of criteria including length of period of record, percent missing data, number of station moves and other station changes that may affect the data homogeneity, and spatial coverage.”

and

“The homogeneity adjustment scheme described in Karl and Williams (1987) is performed using the station history metadata file to account for time series discontinuities due to random station moves and other station changes.”

The new site might be at a different altitude, and a higher site generally means lower temperatures, so they add in a correction for the station move. My guess is that a weather station is associated with some geographical region, and a station gets moved around within its associated region.
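
A crude sketch of the altitude component of such a correction, using the standard atmospheric lapse rate of roughly 6.5 °C per kilometer (the actual Karl and Williams homogeneity scheme is far more involved and works from the station history metadata):

    # Standard environmental lapse rate: temperature falls about
    # 6.5 °C per kilometer of altitude in the lower atmosphere.
    LAPSE_RATE_C_PER_M = 0.0065

    def station_move_correction(old_alt_m: float, new_alt_m: float) -> float:
        """Additive correction (°C) to splice readings from a new site
        onto the old record after a station move to a different altitude."""
        # A higher new site reads colder, so the difference is added back.
        return (new_alt_m - old_alt_m) * LAPSE_RATE_C_PER_M

    # Example: station moved from 150 m to 400 m elevation.
    print(station_move_correction(150.0, 400.0))  # ~1.63 °C added to new-site readings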

LA replies:

So (I hope I’m getting your meaning) the readings that a station gets of an unchanging location (say, the atmospheric temperature at Harrisburg, Pa.) are relative to the changing location of the weather station itself, and this has to be adjusted for each weather station’s location.

What this suggests is that virtually all the data they use in their climatological histories and projections are not original but adjusted. And the adjustment is done by various calculations and estimates which are based on various models and could themselves be wrong.

Can there then even be an objective basis for climatological predictions?

A. Zarkov writes:

Mr. Auster remarks,

“What this suggests is that virtually all the data they use in their climatological histories and projections are not original but adjusted. And the adjustment is done by various calculations and estimates which are based on various models and could themselves be wrong.”

Exactly. Here’s another example. The New Zealand Climate Coalition (an anti-global warming group) accuses New Zealand’s National Institute of Water and Atmospheric Research (NIWA) of creating a temperature trend through bogus adjustments. The coalition presents two graphs: one is the official plot with adjustments, and the other is the unadjusted temperature record. The unadjusted data show no warming effect for New Zealand. See here.

I don’t agree with the Coalition. The data does need adjustment for legitimate physical effects. But the adjustment process must be absolutely transparent and subject to independent audit. It must also be based on sound mathematics and physics. Let’s remember they are trying to detect about a one degree centigrade change per century! That’s not much. Even a very small but universal sensor bias could invalidate the whole business. At this point, I’m not sure the data are accurate enough to detect such a small change for the whole world. I would have to look at the data. I’m not particularly impressed with the climatologists’ statistical skills. And in this I’m not alone. Edward Wegman, professor of statistics at George Mason University, says the same thing. See this National Post story. Incidentally, Wegman is the author of a Congressional report debunking the so-called “Hockey Stick” graph from Michael Mann. Mann appears frequently in the Climategate emails.
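
The point about a small but universal bias is easy to illustrate numerically: a single uncorrected step of a few tenths of a degree, fed into a naive straight-line fit, masquerades as a century-scale trend of the same order as the signal being hunted. A minimal sketch with synthetic data:

    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1900, 2000)

    # Synthetic record: no real trend, just measurement noise...
    temps = 14.0 + rng.normal(0.0, 0.2, size=years.size)
    # ...plus one 0.3 °C step in 1960, e.g. an uncorrected sensor change.
    temps[years >= 1960] += 0.3

    # A least-squares line through the stepped record reads the
    # discontinuity as warming: roughly 0.4 °C per century here,
    # from a record with no underlying trend at all.
    slope, intercept = np.polyfit(years, temps, 1)
    print(f"apparent trend: {slope * 100:.2f} °C per century")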

Climategate is changing so fast that I can’t keep up. There is little doubt this whole enterprise has broken down into a left-right fight. Today the New York Times attack dog Krugman announced that the emails mean nothing because they are just informal chit-chat between scientists. Has he not heard of what’s in the document file? Krugman looks really bad on this one.

November 30

A. Zarkov writes:

Canada’s National Post newspaper ran a 27-part series of articles on climate change critics. These articles appeared in 2006-2007, so they don’t include the latest information, and of course nothing on Climategate. But the series does provide an excellent overview of the case against global warming for the general reader. I especially like the first one: Statistics Needed. The whole series is available here. VFR readers who want the background to understand the Climategate controversy are well advised to read all 27 parts.

I’m really angry about the letter to the U.S. Senate in support of global warming signed by the president of the American Statistical Association (ASA). You can read the letter here. In his report to Congress, Professor Wegman (see the National Post article above, which features him) points out that statisticians have been largely shut out of the whole global warming enterprise. Most of the statistical work has been done by non-statisticians, and done badly. Yet the ASA president signs a letter of support! Did the ASA president not read the Wegman report? I can only conclude the rot has penetrated deeply into American institutions of science. I am going to raise a stink about this with my colleagues.

* * *

Here is the London Times article that Mr. Zarkov linked:

November 29, 2009
Climate change data dumped
Jonathan Leake, Environment Editor

SCIENTISTS at the University of East Anglia (UEA) have admitted throwing away much of the raw temperature data on which their predictions of global warming are based.

It means that other academics are not able to check basic calculations said to show a long-term rise in temperature over the past 150 years.

The UEA’s Climatic Research Unit (CRU) was forced to reveal the loss following requests for the data under Freedom of Information legislation.

The data were gathered from weather stations around the world and then adjusted to take account of variables in the way they were collected. The revised figures were kept, but the originals—stored on paper and magnetic tape—were dumped to save space when the CRU moved to a new building.

The admission follows the leaking of a thousand private emails sent and received by Professor Phil Jones, the CRU’s director. In them he discusses thwarting climate sceptics seeking access to such data.

In a statement on its website, the CRU said: “We do not hold the original raw data but only the value-added (quality controlled and homogenised) data.”

The CRU is the world’s leading centre for reconstructing past climate and temperatures. Climate change sceptics have long been keen to examine exactly how its data were compiled. That is now impossible.

Roger Pielke, professor of environmental studies at Colorado University, discovered data had been lost when he asked for original records. “The CRU is basically saying, ‘Trust us’. So much for settling questions and resolving debates with science,” he said.

Jones was not in charge of the CRU when the data were thrown away in the 1980s, a time when climate change was seen as a less pressing issue. The lost material was used to build the databases that have been his life’s work, showing how the world has warmed by 0.8C over the past 157 years.

He and his colleagues say this temperature rise is “unequivocally” linked to greenhouse gas emissions generated by humans. Their findings are one of the main pieces of evidence used by the Intergovernmental Panel on Climate Change, which says global warming is a threat to humanity.

December 3

A. Zarkov writes:

Regarding your correction in the initial entry concerning the fact that the data was dumped in the 1980s:

While the Climatic Research Unit (CRU) is the primary repository of the world’s temperature data, it’s not the only one. We also have the NASA Goddard Institute for Space Studies (GISS) and the National Climatic Data Center (NCDC). Both these repositories have their original unadjusted temperature data. There is tremendous overlap among the CRU, GISS, and NCDC because each gets data from the others. We also have repositories in Australia and New Zealand which have also retained their unadjusted data. At worst, some original British temperature data might have been lost, but I doubt even that. It should take only a small effort for the CRU to restore the lost data, unless somehow the original paper records were destroyed. But I doubt the British Meteorological Service (or whatever it’s called) would have released its paper records, and most likely it has it all in digital form.
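
In data-handling terms, restoring lost raw records from the overlapping archives is a straightforward merge. A minimal sketch (the file names, column names, and keys are hypothetical; the real archives use their own station identifiers and formats):

    import pandas as pd

    # Hypothetical exports from the surviving archives,
    # keyed by station identifier and year.
    giss = pd.read_csv("giss_raw.csv", index_col=["station_id", "year"])
    ncdc = pd.read_csv("ncdc_raw.csv", index_col=["station_id", "year"])

    # Rebuild a raw-data table: take GISS values and fill any gaps
    # from NCDC. Station-years held by neither archive stay missing.
    restored = giss.combine_first(ncdc)
    print(restored["temp_c"].isna().sum(), "station-years still missing")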


Posted by Lawrence Auster at November 29, 2009 07:20 PM