Congress May Probe Leaked Global Warming E-Mails


A few days
after leaked e-mail messages appeared on the Internet, the U.S.
Congress may probe whether prominent scientists who advocate
global warming theories misrepresented the truth about climate
change.
Sen. James
Inhofe, an Oklahoma Republican, said on Monday the leaked correspondence
suggested researchers "cooked the science to make this thing
look as if the science was settled, when all the time of course
we knew it was not," according to a transcript
of a radio interview posted on his Web site. Aides for Rep. Darrell
Issa, a California Republican, are also looking
into the disclosure.

The leaked
documents (see our previous coverage) come from the Climatic Research Unit
of the University of East Anglia in eastern England. In global warming
circles, the CRU wields outsize influence: it claims the world’s
largest temperature data set, and its work and mathematical models
were incorporated into the United Nations Intergovernmental Panel
on Climate Change’s 2007 report. That report, in turn, is what
the Environmental Protection
Agency acknowledged
it "relies on most heavily" when concluding
that carbon dioxide emissions endanger public health and should
be regulated.

Last week’s
leaked e-mails range from innocuous to embarrassing and, critics
believe, scandalous. They show that some of the field’s most prominent
scientists were so wedded to theories of man-made global warming
that they ridiculed dissenters who asked for copies of their data
("have to respond to more crap criticisms from the idiots"),
cheered the deaths of skeptical scientists, and plotted how to keep researchers who reached
different conclusions from publishing in peer-reviewed journals.

One e-mail
message, apparently from CRU director Phil Jones, references the
U.K.’s Freedom of Information Act when
asking another researcher to delete correspondence that might be
disclosed in response to public records law: "Can you delete
any emails you may have had with Keith re AR4? Keith will do likewise."
Another, also apparently from Jones: global warming skeptics "have
been after the CRU station data for years. If they ever hear there
is a Freedom of Information Act now in the UK, I think I’ll delete
the file rather than send to anyone." (Jones was a contributing
author to the chapter
of the U.N.’s IPCC report titled "Detection of Climate Change
and Attribution of Causes.")

In addition
to e-mail messages, the roughly 3,600 leaked documents posted on
various Web sites
include computer code and a description of how an unfortunate programmer
named "Harry" – possibly the CRU’s Ian
"Harry" Harris
– was tasked with resuscitating
and updating a key temperature database that proved to be problematic.
Some excerpts from what appear to be his notes, emphasis added:

I am seriously
worried that our flagship gridded data product is produced by
Delaunay triangulation – apparently linear as well. As far
as I can see, this renders the station counts totally meaningless.
It also means that we cannot say exactly how the gridded data
is arrived at from a statistical perspective – since we’re
using an off-the-shelf product that isn’t documented sufficiently
to say that. Why this wasn’t coded up in Fortran I don’t know
– time pressures perhaps? Was too much effort expended on
homogenisation, that there wasn’t enough time to write a gridding
procedure? Of course, it’s too late for me to fix it too.
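To make "linear" Delaunay gridding concrete, here is a minimal, hypothetical sketch – not CRU's code, with invented station positions and temperatures. Each grid point inside a triangle of stations gets a weighted average of the three station readings, with barycentric weights:

```python
# Hypothetical illustration of linear interpolation over one Delaunay
# triangle -- the basic step in gridding scattered station readings.
# Not CRU's code; stations and temperatures are invented examples.

def barycentric_weights(p, a, b, c):
    """Barycentric coordinates of point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return wa, wb, 1.0 - wa - wb

def grid_value(p, stations, temps):
    """Linearly interpolate station temperatures at grid point p."""
    weights = barycentric_weights(p, *stations)
    return sum(w * t for w, t in zip(weights, temps))

# Three stations forming a triangle, with temperature readings:
stations = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
temps = (10.0, 20.0, 30.0)
print(grid_value((1 / 3, 1 / 3), stations, temps))  # centroid -> average of the three readings
```

One reading of the complaint above is that with purely linear weights like these, each grid value rests on exactly three stations no matter how dense the network is, which is why Harry says the station counts become meaningless.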

I am very
sorry to report that the rest of the databases seem to be in
nearly as poor a state as Australia was
. There are hundreds
if not thousands of pairs of dummy stations, one with no WMO and
one with, usually overlapping and with the same station name and
very similar coordinates. I know it could be old and new stations,
but why such large overlaps if that’s the case? Aarrggghhh! There
truly is no end in sight… So, we can have a proper result, but
only by including a load of garbage!
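The duplicate-station problem he describes can be sketched in a few lines. This is a hypothetical example – the record format, names, coordinates, and WMO numbers are invented, not taken from the leaked files: flag pairs of records that share a name and nearby coordinates where one carries a WMO identifier and the other does not.

```python
# Hypothetical sketch of the duplicate check Harry describes: find pairs
# of station records with the same name and nearly identical coordinates,
# where one record has a WMO identifier and the other lacks one.
# Record layout and all values are invented for illustration.

def near(a, b, tol=0.1):
    """True if two coordinates are within tol degrees of each other."""
    return abs(a - b) <= tol

def duplicate_pairs(stations):
    """Return (with_wmo, without_wmo) pairs of likely duplicate stations."""
    pairs = []
    for s in stations:
        for t in stations:
            if (s is not t and s["name"] == t["name"]
                    and near(s["lat"], t["lat"]) and near(s["lon"], t["lon"])
                    and s["wmo"] is not None and t["wmo"] is None):
                pairs.append((s, t))
    return pairs

stations = [
    {"name": "ALPHA", "lat": 45.40, "lon": -75.70, "wmo": 12345},
    {"name": "ALPHA", "lat": 45.41, "lon": -75.69, "wmo": None},
    {"name": "BETA", "lat": 51.00, "lon": -114.00, "wmo": 67890},
]
print(len(duplicate_pairs(stations)))  # the two ALPHA records form one pair
```

A real cleanup would then still have to decide, as Harry could not, whether each flagged pair is one station recorded twice or genuinely an old and a new station.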

One thing
that’s unsettling is that many of the assigned WMO codes for Canadian
stations do not return any hits with a web search. Usually the
country’s met office, or at least the Weather Underground, show
up – but for these stations, nothing at all. Makes me wonder
if these are long-discontinued, or were even invented somewhere
other than Canada!

…how long it takes to debug this suite – the experiment endeth
here. The option (like all the anomdtb options) is totally undocumented
so we’ll never know what we lost. 22. Right, time to stop
pussyfooting around the niceties of Tim’s labyrinthine software
suites – let’s have a go at producing CRU TS 3.0! since
failing to do that will be the definitive failure of the entire
project.

Ulp! I am
seriously close to giving up, again. The history of this is so
complex that I can’t get far enough into it before my head hurts
and I have to stop. Each parameter has a tortuous history of manual
and semi-automated interventions that I simply cannot just go
back to early versions and run the update prog. I could be
throwing away all kinds of corrections
– to lat/lons,
to WMOs (yes!), and more. So what the hell can I do about all
these duplicate stations?…

As the leaked
messages, and especially the HARRY_READ_ME.txt
file, found their way around technical circles, two things happened:
first, programmers unaffiliated with East Anglia started taking
a close look at the quality of the CRU’s code, and second, they
began to feel sympathy for anyone who had to spend three years
(including working weekends) trying to make sense of code that appeared
to be undocumented and buggy, while representing the core of CRU’s
climate model.

Read the rest of the article.

25, 2009
