Met Office code
John Graham-Cumming reports that the Met Office has published the code for preparing the land surface records.
This is slightly odd. What appears to have been released is the code for generating the CRUTEM land temperature index, which is actually prepared by CRU. However, this does tally with what we know about the data the Met Office released the other day. This was, contrary to the impression given by the Met Office press release, actually the corrected data that is used as input to the CRUTEM average and also to the HADCRUT global temperature index. It's the latter index that most people are interested in.
If this is confusing you, I've prepared a summary of my understanding of how it all fits together. I'm not promising this is correct.
I've made everything but the data and code released by the Met Office semitransparent. As you can see, what we are looking at are intermediates in the preparation of the global temperature index. While this is welcome, the guts of the changes are in the selection of the stations and in the correction of those stations for the plethora of problems with them - urban heat islands, changes in equipment, station moves, changes in observation time and so on. So while there is a feel of increasing openness, in reality, the shutters are only open the barest crack and it's still not possible to make out what's going on inside.
Meanwhile, even this extremely limited attempt at openness is not all it seems to be. John G-C has been looking at the code and running it against the data he has. What he has found is that prior to 1855 there was no southern hemisphere data and that when you run the Met Office's newly released code, this shows up as a gap in the graph of the average. But there is no such gap in the actual CRUTEM index. John's conclusion is that what we're looking at is not the actual code used in CRUTEM, but something written especially for public consumption. In light of the scorn that many programmers have been pouring on the quality of the coding standards at CRU, this might suggest that the original code was just too awful to make available for public inspection.
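To see why missing hemisphere data shows up as a gap rather than just a noisier average, here is a toy sketch of the final averaging step in Python. This is my own illustration, not the released code: it assumes a simple equal-weight combination of hemispheric means, whereas the real CRUTEM calculation first grids stations into boxes and area-weights them.

import numpy as np

def hemispheric_average(anomalies):
    # Mean of whatever station anomalies a hemisphere has; NaN if it has none.
    return np.mean(anomalies) if len(anomalies) else np.nan

def global_average(nh, sh):
    # Equal-weight combination of the two hemispheres. If one hemisphere
    # has no data at all, the NaN propagates and the year plots as a gap.
    return (hemispheric_average(nh) + hemispheric_average(sh)) / 2.0

print(global_average([0.10, -0.20, 0.05], []))      # nan: no SH data, a gap
print(global_average([0.10, -0.20, 0.05], [0.30]))  # 0.1417: a normal year

Run against the released station files, any year with an empty southern hemisphere would drop out of the plot in exactly the way John describes.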
Reader Comments (26)
"...this might suggest that the original code was just too awful to make available for public inspection." Isn' the original CRU code in the now infamous HARRY_ READ_ME file? Seems strange that the Met Office should not have been aware of what CRU was using to produce HADCRUT. And is this why the Met Office announced a few weeks ago that they were going to recalculate the land surface temperatures? Is this alleged "new" code their first step in that process?
His words and plot show an absence of Southern Hemisphere data in the year 1855, not "prior to 1855".
Once, years ago, I had the unpleasant task of removing some termites from the rim joist of my house -- at least the contractor I hired did. First, he busted out the plaster covering the infested area, which exposed the infested wood. He probed it with a steel bar and the termites came scrambling out in all directions, until I (to help reduce costs, I assisted him) sprayed them with Diazinon. Never felt so good as when I watched the little buggers who were trying to destroy my house curl up and die.
I hope you see the parallel. The question is: can we spray them? No? Oh, what a pity. They deserve it.
Merry Christmas to you Christians, Happy Chanukah to those of you who are Jewish (I know, it's over, but better late than never), Happy Kwanzaa to those of you who follow it, and a very bad New Year for the AGW crowd.
So while there is a feel of increasing openness, in reality, the shutters are only open the barest crack and it's still not possible to make out what's going on inside.
They have given a talking point to their supporters and the hoi polloi: "See, we have released everything and there is nothing wrong."
As for "In light of the scorn that many programmers have been pouring on the quality of the coding standards at CRU, this might suggest that the original code was just too awful to make available for public inspection."
I wouldn't be too sure that is the reason. The fact that they have covered a gap shows they are fiddling with the data.
Hi,
I have posted elsewhere on this, as below, because I believe Quality Assurance is the Achilles heel of this AGW movement. When they say "show me your peer review", we ask for the QA documentation. When they say the data is unequivocal, we ask for the Quality Audit results, and so on. Their software and data processing is all schoolboy stuff and should be attacked on the basis that it does not meet the Quality Standards the Government insists on from its suppliers. This should be the major sceptic tactic.
"As an engineer with a lifetime of experience in electrical measurement I cannot understand that in all the discussion I
have not seen the words Quality Assurance mentioned. All of our measurements and instruments had to have traceability and customers would full access to inspect our labs and our measurements. Every engineering firm dealing with Government of Semi-Government had to have similar QA procedures. Now while I can understand
that it would be difficult to apply QA to scientific theory it appears to me that the measurement of temperature should definitely have QA procedures and documentation. These should be available on request from the authorities concerned as no Government purchasing body in Australia or other Western countries will accept product from firms without QA. It is totally inconcievable to me that Anthony Watts and his crew of amateurs should be required to perform basic site QA .
The fact that not only do we have measurement without any quality assurance, the very processes used are not divulged.
I think all the climate warmists need is a detailed independent Quality Audit and most global warming will disappear"
In regards to the bad code, didn't Jones et al say that they didn't want to release their code etc. to McIntyre because of all their blood, sweat and tears? If this questionable code is the best they have, what the heck have they been doing? It looks more like their hard work went into "influencing" public perception of AGW than into proving it out.
The chart is very helpful for us non-scientists. Perhaps you can extend it to include the US data sets pre/post pasteurization as well?
Is the code the sort of product one would expect from a
- well educated
- reasonably intelligent
- hard working
- honourable
- everyday kind of person?
Or does the code not quite meet that standard?
Is there no record of the date of each line of code or the author or update identifier (the basic requirement of QA)?
The gap at 1855 is very intriguing.
In the first instance, might we now say that CRU 'hid the gap'?
In the second instance - are these the raw data gathered by the ships of the Royal Navy (that's how I understand your graph, above)? If so, what happened?
The main point of course is that neither of these two institutions mention the gap, but just slither over it.
Hiding gaps in raw data, and not explaining that one has a gap and why, is a strict no-no where I come from ...
In light of the scorn that many programmers have been pouring on the quality of the coding standards at CRU, this might suggest that the original code was just too awful to make available for public inspection.
As an IT Consultant with QA experience, I'd say it is only human nature to clean and tidy up code before publishing it. Programs evolve with different authors. I am not defending it, but given the public scrutiny no manager would willingly allow out anything that is not "pleasant on the eye". It does not necessarily mean anything untoward. It is just human nature.
However, the released version of the program must be SHOWN to have produced identical results to any untidied version - that is a relatively simple exercise. Unfortunately relatively simple exercises seem to have been beyond CRU so far in their ramshackle development processes.
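A sketch of what that demonstration could look like in Python - the program names here are hypothetical placeholders, not the actual Met Office scripts:

import subprocess

def monthly_values(program, station_file):
    # Run one version of the program and parse its output as numbers.
    result = subprocess.run([program, station_file],
                            capture_output=True, text=True, check=True)
    return [float(v) for v in result.stdout.split()]

old = monthly_values("./original_untidied", "stations.txt")
new = monthly_values("./cleaned_for_release", "stations.txt")

assert len(old) == len(new), "the versions emit different numbers of values"
for i, (a, b) in enumerate(zip(old, new)):
    assert abs(a - b) < 1e-6, f"value {i} differs: {a} vs {b}"
print("outputs identical to within rounding")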
Sorry, I hit the post button too soon...
Unless we know otherwise, though, we should not paint the Met Office with the same brush as CRU. The Met Office SHOULD have better development processes in place than CRU. Academics are not developers (though they think they are). I am guessing a fair proportion of the Met Office staff are development staff.
I am baffled. The BBC repeat the Met Office claim that the last ten years have been "the warmest on record". This is the opposite of the widespread skeptic claim that there has been no warming since 1998. No two statements could be more contradictory.
CRU was clearly full of "availability scientists" - proofs constructed while you wait. But what of the Met Office? Are they just much smarter at hiding things? What are the names? The Met Office is the main employer of climate scientists. Forget CRU, vain and incompetent academics, they are toast. We need to nail the Met - they are the professionals, the KGB of Climate Astrology.
AndrewSouthLondon
I'm not necessarily agreeing with or supporting the Met office claim regarding the last decade being the hottest, but keep in mind no warming and being the warmest decade are not necessarily contradictory claims. For example, if 2000 was the warmest year recorded, and the next 10 years were the same temperature as 2000, then it would be the hottest decade without there being any overall warming.
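A toy illustration with invented numbers (not real HadCRUT values):

nineties  = [0.20, 0.22, 0.25, 0.28, 0.30, 0.32, 0.35, 0.38, 0.55, 0.33]
noughties = [0.45] * 10              # flat: no warming within the decade

print(sum(nineties) / 10)            # 0.318 -- the 1990s average
print(sum(noughties) / 10)           # 0.450 -- the warmer decade on average
print(noughties[-1] - noughties[0])  # 0.0   -- yet zero change across it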
So the only data released has already been 'adjusted'. Hardly open.
Your diagram shows a disconnect from CRU once the adjusted data is sent to the Hadley Center. Is that the end of the CRU's involvement in HADCRUT?
I have read a number of the Climategate emails (though only the specific ones linked from articles), but are many addressed to or from Hadley Center people? I would have thought that there would be a fair bit of correspondence between the two to do with HADCRUT.
[BH adds: I think that's right. CRU only deals with the land temps. I don't recall significant quantities of email re HADCRUT. Worth checking though]
AndrewSouthLondon
Both statements are correct (at least based on the average temperatures published by HADCRUT and NASA GISS; even if the code behind both is reliable, there are issues with high-latitude station drop-out after 1990, particularly in Siberia and Canada, and hence with the representativeness of the dataset used), and they are comparing apples with oranges - a trick that both the arch-warmists and the rampant deniers use far too often.
The 90s generally showed a warming trend until 1998, which was an exceptionally warm single year as a result of a very large El Nino, releasing much heat energy from the mid depths of the central to south Pacific.
1999 and 2000 were somewhat cold as a consequence of La Nina. From 2001 onwards, temperatures have pretty well plateaued at a level higher than the 90s average but lower than 98, so giving the noughties a higher 10 year average. Statistically (check out Lucia's Blackboard), the monthly anomaly data from 2000 or 2001 show no significant trend (depending on the start date and dataset used, there can be either a marginally positive or marginally negative trend, but nothing to pay much attention to).
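For anyone who wants to try the arithmetic themselves, here is a minimal sketch of that kind of trend test in Python. The anomaly series below is synthetic - a plateau plus noise, not real data - and a proper analysis would also correct for autocorrelation in the residuals, as Lucia's posts do.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
months = np.arange(108)                     # Jan 2001 - Dec 2009, monthly
anoms = 0.45 + rng.normal(0.0, 0.10, 108)   # flat series plus weather noise

fit = stats.linregress(months, anoms)
print(f"trend: {fit.slope * 120:+.3f} C/decade, p = {fit.pvalue:.2f}")
# A p-value well above 0.05 - the expected outcome for a flat series -
# means no statistically significant trend.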
So, they've released the sexed up data? How very Tony Blair of them. What's the betting some years down the line they'll follow Tony's lead and say 'Whatever the evidence was I would have believed what I believe anyway'?
Speaking of code, I was pleasantly surprised to read this, at Free Software Magazine. Richmond makes a good point, too: for all the CRU's bluster about “science as usual”, other disciplines are more open with their work.
Jiminy Cricket
The AGW movement has been afforded too many "...it is just human nature" concessions at this point.
At every point in their endeavours -
for their activism instead of a disinterested approach,
for their spaghetti code instead of a professional product,
for their conspiring against the MMs instead of working with them,
for their trying to shut down others' papers instead of minding their own business -
the excuse given by someone at some point for every wrong thing the Team has done is "...it is only human". Science is not done like that.
The code up at the Met Office does not appear to be the code used by CRU:
http://strata-sphere.com/blog/index.php/archives/11957
I have been involved in QA for financial trading software for over 17 years, and it should have been very easy to QA these programs: a lot of hard grunt work, but easily done. I have never once looked at the actual code to QA its effectiveness, but have always been able to test its reliability based on the description of what it is supposed to accomplish. You take a raw data point, manually calculate what the code is supposedly doing, and compare the manual result to the automated result. QA doesn't prove that the methods or the reasons for adjustments are justified or valid, but it can prove the code does what they claim. Up until now we have had to Ouija-board out the methods by trying to decipher the code and reverse-engineer them.
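A sketch of that kind of check in Python - compute_anomaly is a hypothetical stand-in for the routine under test, and the numbers are invented:

def compute_anomaly(temp, baseline_mean):
    # Stand-in for the routine under test: a simple baseline anomaly.
    return temp - baseline_mean

baseline = (14.8 + 15.1 + 14.9) / 3   # toy baseline-period mean = 14.9333
expected = 0.3667                     # 15.3 - 14.9333, worked out by hand

assert abs(compute_anomaly(15.3, baseline) - expected) < 1e-3
print("the automated result matches the manual calculation")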
My bet is that they cannot reconcile their methods with accepted practices and are trying to hide that behind general terms, i.e. homogenizing, etc.
that should "financial software"
The Met Office did promise the UK a barbecue summer and a mild winter. They usually get the weather forecast spot on two days ahead, but it would appear that longer term their forecasts are no better than guesses. This is one of the reasons that I am sceptical of those who claim to be able to predict the climate even decades into the future.
Thanks for the explanations. It seems the only honest position is that we don't really know whether the climate is warming or not. Anyone who says anything with certainty is almost certainly lying. The BBC and the Met Office push the "warmest decade on record" line as an undisputed truth, and so we pay to be lied to. The same logic by which, in Stalin's Russia, political opponents were shot and their families then asked to pay for the bullets.
Anand Rajan KD
As someone who has been involved in software development for 27 years, I have to say that tarting up your code before others see it is normal human nature. No different from combing your hair in front of the mirror before going out on a Saturday night as a teenager. I know that if I were the programmer, I would review such a program before release and make sure it was not embarrassing in front of my peers.
This is why Open Source software is so powerful, and generally of better quality with fewer errors. And doing a regression test on a fixed set of input data is not rocket science.
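For instance, a minimal golden-file sketch - the file names and the processing function are hypothetical placeholders:

import json

def process_stations(records):
    # Hypothetical stand-in for the processing step under test.
    return [round(t - 14.0, 3) for t in records]

with open("fixed_input.json") as f:        # a frozen input dataset
    records = json.load(f)
with open("expected_output.json") as f:    # output blessed on an earlier run
    expected = json.load(f)

assert process_stations(records) == expected, "regression: output has changed"
print("regression test passed")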
To ALWAYS ascribe a conspiracy to every action of every organisation promoting AGW is not a wise path to follow. It can lead you down dead ends with a lot of expended and wasted energy.
Please go to PBS here and post a comment regarding Jim Lehrer’s interview with Obama
http://www.pbs.org/newshour/rundown/2009/12/excerpt-obama-on-disappointment-in-copenhagen.html