Thursday
Apr 01 2010
by Bishop Hill
Russell review submissions
Apr 1, 2010 Climate: CRU Climate: Russell
Sir Muir and his band of merry men have published some of the submissions they have received. Included among them is a letter I wrote to Sir Muir after the resignation of Philip Campbell, suggesting a way by which his panel could appoint a mutually acceptable replacement. I didn't receive a reply to this, or to a subsequent email. It's rather odd to see it now listed as a submission to the panel.
Reader Comments (48)
Bish: Did you not provide an actual submission?
I can't see the submission by Mann that was supposedly made at 1 minute to midnight on the last day of submissions.
I've 10 years in IT (telcos/banks) and an MSc in information systems (1st degree chemistry).
The government standards for software development, project management, quality control and audit are copious and unbending.
Yet climate scientists fiddle around in their labs; a PhD student with a geography degree wrote a lot of what Harry was trying to sort out (Tim's labyrinthine suites).
So an extract from a submission, ref the coding standards, shows what a farce the CRU code was:
http://www.cce-review.org/evidence/Bratby.pdf
Submission of P A W Bratby to the Independent Climate Change Email Review
1. My submission is based on my professional experience from over 35 years of
work in a scientific/engineering environment coupled with a detailed review of the
CRU emails/data in relation to my experience.
2. I was involved in a review and commentary of the emails. The findings from the
perspective of working physicists are given at:
http://scienceandpublicpolicy.org/reprint/climategate_analysis.html
3. The following are my main impressions of the scientists and the work at CRU.
4. There is a complete lack of professionalism at CRU. The scientists were
attempting to do work for which they were not SQEP (suitably qualified and
experienced persons). It is evident that the main areas of expertise required to
construct global climate dataset (e.g. CRUTEM) and climate histories are statistics
and software development. Both these skills are lacking at CRU. The major
backgrounds appear to be in obtaining field data in the areas of dendro- and paleoclimatology
without any understanding of the physics of the climate. There was
no attempt to obtain people with the correct skills or even to seek assistance from
such people.
5. There is no evidence of the application of even a most basic quality management
system. There appears to be no policy in the areas of data control, data archiving,
software development, software documentation and software configuration and
control. There is no evidence of an archiving system or of the use of International
standards such as ISO9001 (for business processes) and TickIT certification (for
IT standards) to manage the work.
6. The work of the CRU scientists passed from academic science into recommending
society-wide economic and industrial engineering through the influence of the
CRU involvement in the IPCC. As an engineering project affecting society at
large, a transition the CRU scientists voluntarily promulgated through their
participation in the IPCC, their work should no longer have been exempt from an
engineering-quality validation effort.
Philip
No, I ran out of time. But frankly, having had no response to my emails, I took the hint.
Sir Muir Russell said
This is just the first cohort of submissions - we will continue to upload others onto the website as soon as is practicable.
I want to take this opportunity to thank everyone who made a submission for their contribution. The Review team has read and noted all submissions.
Come on, come on, what's the hold-up, Sir Muir? You have had the submissions for at least a month.
I see a recommendation for your book on one of the submissions Bish.
Impressive submission from McKitrick: http://www.cce-review.org/evidence/McKitrick1_61.pdf
Barry Woods: Thanks for posting some of my evidence. You were quick off the mark to read it!
I read (most of) HARRY_READ_ME.txt on the 20th November 2009.
For the next 3 weeks I was commenting everywhere: forget the emails, look at the code, look at HARRY_READ_ME.txt. And forget who Harry is; Tim wrote it.
It took me about 30 minutes to find out exactly who Harry and Tim were.
Tim had left, remember; why could Harry not ask him...
Tim was off studying at theological college, and he is now a minister.
Anyway, is greenwash stronger than whitewash?
Where is an investigative journalist when you need one?
Lots of rhetorical questions!
Extract from another submission:
http://www.cce-review.org/evidence/Mims.pdf
2. The HARRY_READ_ME.txt document
The released CRU document HARRY_READ_ME.txt is especially worthy of careful
review, for it illustrates the questionable nature of at least some of the scientific
methodology employed at CRU. Data products resulting from this methodology were
apparently published or requested under the Freedom of Information Act.
One of the many troubling lines in this document, which was written by a CRU
programmer from 2006 to 2009, reads, “It’s Sunday evening, I’ve worked all weekend,
and just when I thought it was done I’m hitting yet another problem that’s based on the
hopeless state of our databases. There is no uniform data integrity. ...”
This and many other frustrated comments by “Harry” were written during a time when
some of his CRU colleagues were confidently asserting the validity of their data and
ridiculing the findings of those with whom they disagreed.
Authority: Paragraph 1.1 of the Issues for Examination states that "other information
held at CRU" will be examined.
Reference: HARRY_READ_ME.txt can be found at:
http://www.anenglishmanscastle.com/HARRY_READ_ME.txt For some of the more
troubling aspects of this document, search on "rg" to find numerous variations of
"Aarrggghhh!” and then see the associated text.
I've looked at the enormous CRU submission and came upon the following little gem:
"It is unreasonable to expect that every version (including interim versions) of every one of these diverse datasets and analyses, the product of almost forty years work, could be retained by CRU."
That is complete nonsense and demonstrates the amateur behaviour of CRU.
I would say "It is normal practice within a professional organisation (and very cheap and easy) to archive every version of every dataset and analysis"
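Bratby's "very cheap and easy" claim is easy to back up with a sketch. Below is a minimal, hedged illustration in Python of content-addressed archiving; the file layout and function name are my own invention for illustration, not anything CRU (or Bratby) actually uses:

```python
import hashlib
import json
import shutil
import time
from pathlib import Path

def archive_version(dataset_path, archive_dir="archive"):
    """Keep an immutable, content-addressed copy of a dataset file.

    Identical content is stored only once; every distinct version is
    kept forever, and an append-only manifest records when each
    version was seen.
    """
    data = Path(dataset_path).read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    # One copy per distinct content, named by a prefix of its hash.
    copy = archive / f"{Path(dataset_path).name}.{digest[:12]}"
    if not copy.exists():
        shutil.copyfile(dataset_path, copy)
    # Append-only log: which file, which version, when.
    with open(archive / "manifest.jsonl", "a") as log:
        log.write(json.dumps({
            "file": str(dataset_path),
            "sha256": digest,
            "archived_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        }) + "\n")
    return digest
```

Every distinct version of a dataset is kept under its hash, identical versions cost nothing extra, and the manifest answers "which version did we use, and when?", which is exactly the question CRU says it is unreasonable to expect an answer to.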
OUCH!
http://www.cce-review.org/evidence/StephenMcIntyre.pdf
I mean , really really OUCH!!!
"Astonishingly, the Team website says that they have not read all the email exchanges in the
Climategate dossier.
We have read a substantial number of the hacked emails, but not yet all.
Many blog readers have read all the email exchanges in the dossier; the failure of Team members
and staff to do so is dispiriting, to say the least. A fortiori, since the Team has not even read the
complete email dossier, they have obviously not scratched the surface of the balance of their
remit – to examine “other relevant email exchanges and any other information held at CRU”.
The Team’s lack of familiarity with the content of the emails is evident in the incompleteness of
the questions in the Issues Paper, as well as their frequent and almost embarrassing tendency to
miss the point. It was further evident in Dr Boulton’s interjection at the Inquiry press conference
where he was completely confused on the difference between the email containing the notorious
request to delete the emails concerning AR4 and the equally well known email threatening to
keep certain publications out of AR4. Given that hundreds, if not thousands, of blog readers
know the distinction, it is, to say the least, embarrassing that the primary author of the Issues
Paper was confused on the matter."
I wonder if the BBC will report this, or the Guardian?
I did not notice any submission by Michael Mann. Yet I have a recollection of Sir Muir stating that he had received a submission from Michael Mann - just a few minutes before his deadline. Did I just imagine it?
78 page submission from CRU, all prepared in 2 weeks!
Barry:
You couldn't be more right. And it's demonstrably true that they have considered themselves exempt in the most cavalier way in the past.
The defence being offered, inevitably, is that none of the source code leaked was used on data they actually released. This should be a major focus of the Muir Russell inquiry. But as I said yesterday:
What this means of course is that CRU is protected by its own incompetence. And what Steve McIntyre has powerfully argued in his submission released today is that they are now being protected by the incompetence, or worse, of Muir Russell, Geoffrey Boulton and the first UEA inquiry, just as was true in the most part of the parliamentary inquiry.
However, respected IT people around the world, with wide experience and expectations of open source and related quality standards, are waking up to how abysmal the situation in climate science has been. After I dropped in my submission, touting the 'Open Climate Initiative', to the parliamentary inquiry in Westminster with an hour to go on 10th February, I went along in the evening to the London Ruby User Group, which I would consider a good barometer of the modern open source movement. In the pub afterwards I talked to a number of people, including two speakers. Everyone was astounded by the climate science situation as it has been revealed since Climategate.
What Richard North wrote a while back about the climate of opinion having shifted irreversibly is I believe true. But we need to follow through in holding these lamentable attempts at damage limitation, often called inquiries, to the mark. Steve McIntyre has once again done a wonderful job in that regard.
"It is unreasonable to expect that every version (including interim versions) of every one of these diverse datasets and analyses, the product of almost forty years work, could be retained by CRU." = FAIL
I completely agree with Phillip Bratby who says: "It is normal practice within a professional organisation (and very cheap and easy) to archive every version of every dataset and analysis"
Teenage coders on the web know this - but apparently not the scientific mandarins of the CRU.
The Matthews submission is devastating and well argued, at a level which even these panel members will find hard to misunderstand.
Will be interesting to see if Mann emulates Bradley and Hughes re references to Jones and Briffa. The Ray Bradley submission (No 19) says Briffa "has been an active and insightful participant" in the divergence debate and only refers to Jones in the opening para. The Malcolm Hughes submission (No 26) has 16 references to Briffa (excluding to other documentation) and only 1 to Jones. If Jones is feeling "awful", as per Spiegel, this use of the Harry Potter trick - "He Who Must Not Be Named" - won't help hide his decline.
Martyn said
I think you've got the sequencing wrong. The CRU submission was written first and the question headings then extracted and published for others to answer in the question period.
If you look in the document properties for both you can see that the editor/author was Elizabeth Meriwether Gilmer.
Knowing I'll be torn to shreds I would like to comment on the code development furore.
First my confession. I worked in climate science in my earlier career. I used to write and operate data processing software. I developed physics models and my work product was used in important social decisions - such as approving uranium mine development.
I not only wrote the data processing software but often went out and collected the original data, calibrated the instruments and designed monitoring networks.
When I got the data I had invented my own time series database systems (similar to geophysical 'stack' files) and stored the data after processing. It eventually got stored on tape for later retrieval.
Later in my career I worked in economics modeling and then a whole host of commercial enterprises where I managed teams of ISO qualified developers on large scale projects where dollars really counted.
So I can say I have seen both sides of the coin to far greater depth than I would ever want to again.
My first and very important point is that in climate science it is scientists who are doing research and they are using computer software to help get results. Climate scientists aren't programmers - but then again computer programmers aren't climate scientists.
Computers are nothing more than one of many tools to help a climate scientist.
Climate scientists work with the data to get the results for their papers. In the olden days they worked with pencil and paper - and I'd have to say the vast majority of good science in every field was done before computers intruded.
So the scientists know the data. They know what they want to get out of it. They know how to spot an error and they have no problem working through issues such as bugs to get to where they are going.
As an example from another field: the vast majority of spreadsheet developers are incompetent nincompoops. Really! I have never ever seen a spreadsheet without some form of error. And this is in areas where it really costs to make a mistake - such as planning the national budget or designing an aircraft.
It is accepted that people use spreadsheets - but it is also accepted that there will be errors. It relies on the skill and experience of the person using the tool to know what is right and wrong in the result.
If the ISO crowd have their way then every spreadsheet would have to be specified in exact detail. The code entered by professional developers and formal test suites run.
I imagine this may actually happen in very rare circumstances but in general it doesn't. And the reason is that most spreadsheets are tools to help someone do a job. The same as a data processing package developed by a climate scientist is a tool to help them do their job.
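To make the spreadsheet point concrete: the classic failure is a SUM whose fixed range silently misses rows appended later. A hedged Python mock-up of that failure mode (purely illustrative; no real spreadsheet or budget involved):

```python
def sum_range(column, first_row, last_row):
    """Mimic a spreadsheet formula SUM(A<first_row>:A<last_row>)."""
    return sum(column[first_row - 1:last_row])

# A budget column whose total formula was written as SUM(A1:A3).
budget = [100, 200, 300]
formula_range = (1, 3)

# Someone appends a new line item, but nobody updates the formula.
budget.append(400)

stale_total = sum_range(budget, *formula_range)  # silently ignores row 4
true_total = sum(budget)
# The spreadsheet raises no error at all; only a person who knows
# the data can spot that the stale total "looks wrong".
```

This is the point about the skill of the person using the tool: the error is invisible to the tool and visible only to someone with knowledge of the data.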
I'll give an example from real-life. I was engaged as management consultant for the Defence Department of a certain English Speaking Country. The job was to assess the logistical forecasting processes of the Army, Navy, and Airforce. Logistics being rations, bombs, bullets, that sort of thing rather than Battleships and Tanks.
There were three different systems in place. The Airforce had a massive ISO-certified system that could forecast the procurement requirements, down to individual washers for routine maintenance, 5 years in advance. They had hundreds of people working in the area maintaining the data. The Army had a dozen different small programs and spreadsheets for forecasting requirements and a small staff - maybe tens. The Navy - I kid you not - had a single spreadsheet maintained by a Chief Petty Officer.
The difference in complexity was staggering. The difference in effectiveness was minimal. Each system worked well. They never ran out of bombs and bullets and they were perfectly happy with what they had.
In all cases there was a body of knowledge among the people working with the data. In the Airforce's case personal knowledge was relatively unimportant; in the Army's case more so; in the Navy's, experience was critical.
Now in the climate case the same thing applies. It's the scientist who is important, not the computers, software, code or whatever. And no amount of ISO shenanigans is ever going to affect that.
Agreed, you can have bad scientists, but you also have many good scientists. Sure, they'd like better software tools and code, but they sure as hell don't want to go down the path of waiting three months for a minor change in some code to be specified, analysed, coded and tested.
Surely the point is that in climate science, the models *are* the science. Every observation, whether it is temperatures or sea-level from satellites, has to be adjusted in some way.
In many cases, that requires the scientist/programmer to make assumptions. And it is those assumptions, which inform the programming effort, which are being challenged as much as the accuracy of the code itself.
Rick Bradford said
I'm afraid I have to disagree. The models are an oversimplification of physical processes that are understood by the scientist. The scientist should devise experiments that test the results of the models and use that to improve the model or rectify faults in the physics model or the coded implementation.
Also, to briefly summarise my long post above:
Good code can't make a bad scientist good.
Bad code can't make a good scientist bad.
Having read all the submissions released, I agree with ZT that Ross McKitrick's submission (No 15) is outstanding. Its clarity is in sharp contrast to the victim waffle of the CRU submission (No 05).
Of particular irony is McKitrick's analysis of the non-availability of CRU's adjusted data when CRU's response to:
Q 4.4 Has this been an orderly and objective process applied to all datasets?
includes the statement "We have recently gained access to 475 stations from Russia. We will be looking at these data to determine if they are an improvement on the stations we have. This will take some time as, at present, there seems to be no documentation as to how adjustments (that are evident following initial comparisons with our data) have been made."
It is also interesting to see how the submissions from both CRU (No 05) and Bradley (No 19) use the same its-so-complicated-you-wont-understand-it response to:
Q 2.3 How important is the assertion of "unprecedented late 20th century warming" in the argument for anthropogenic forcing of climate?
CRU - last para of response p33 and a miracle of double negatives -
Warmer temperatures in the past, if they were caused by forcing mechanisms that can be shown to be unable to explain the current warming, would not affect the attribution of the current warming to anthropogenic causes. For example, Goosse et al. (2006) explore the possibility that a combination of land-use changes and natural changes in the Earth's orbit, in solar irradiance and in volcanic activity might have caused relatively warm summer conditions in Europe during the MWP. Such changes are not able to explain the current warm conditions, especially in winter. This question moves away from the main focus of this review - the issues and allegations that arose from the hacked emails - and is really a scientific question for the broader scientific community.
Hadley p2 -
The argument for anthropogenic forcing of climate is based on our understanding of radiation, atmospheric physics, feedbacks and model simulations which provide a multi-dimensional fingerprint of the changes expected with increased levels of greenhouse gases. Late 20th century warming is an expected, and well-observed, consequence of the increase in greenhouse gases that have been measured. No other forcing can explain the warming observed in the late 20th century. As noted earlier [bold on] this question has no bearing whatsoever on your brief [bold off] to examine, ".. whether there is any evidence of the manipulation or suppression of data which is at odds with acceptable scientific practice ...."
The submissions, not surprisingly, fall into 2 distinct camps.
One side is notable for its constant refrain "have known Jones et al for many years, have the greatest respect for their scholarship, insight and scientific integrity", "have been open with the data and methodology" along with some general remarks about proxies and criminal hacking of e-mails.
The other side cluster around the idea: "here's what the individual e-mails said, showing these scientists' own words as to what they did."
McKitrick's submission is a tour-de-force, and utterly compelling. He delivers a full chronology of each point which has been raised, splicing in e-mails from Jones et al to show them in context.
If the panel members will just read McKitrick's submission, Jones is sunk without trace and the AGW myth with him.
I note that many of the AGW camp who support Jones (see Rick Bradford comment) for his integrity etc also use the argument of ignorance "No other forcing can explain the warming observed in the late 20th century." e.g. Bradley.
Response to the argument of ignorance: firstly, we don't know what the warming really is, due to data manipulations etc.; and secondly, we don't understand all the climatic processes.
It really is pathetic to rely on the argument of ignorance.
The more I read what these people write, the more I get the feeling they are pretty incompetent and are not scientists.
Following on from my previous comment, not surprisingly, Hughes uses the same argument of ignorance "No other forcing can explain the warming observed in the late 20th century."
Wonder if Mann will when his evidence gets posted?
Reading Ross Mckitrick's submission again, para [89] is devastating:
"I submit that evidence sufficient to disprove a claim of fabrication would consist of the p value
supporting the claim of statistical insignificance made on page 244 of IPCC Working Group I, the peer reviewed journal article in which it was presented, and the page number where the study is cited in the IPCC Report. I request that the Inquiry ask Dr. Jones to produce these things. An inability on his part to do so would, I submit, establish that the insertion of the paragraph quoted above at paragraph [83] amounted to fabrication of evidence, with the effect of concealing problems in the CRU temperature data upon which some of the core conclusions of the IPCC were based."
Imagine if McKitrick's submission was sent to every scientist in the world.
Oh that can happen (the internet)
I do not see how, even 'when' they are 'cleared' to go back to work, Jones, Briffa, 'Harry', etc. will ever work in climate science again... Would any other researcher (given McIntyre's and McKitrick's submissions, and particularly the computer-science criticism of the code) want to be associated with them?
Sadly, ANYBODY with a UEA degree may in future have problems in the job market.
CRU, and the actions of the vice chancellor etc. (Acton?), will have damaged the credibility of all science from that institution.
They will be teaching students about this in the future.
A quote from a member of the public, elsewhere: (a car forum: pistonheads)
[quote=kiteless]A thought:
Climate change is a natural occurrence, whether we like it or not. We are being told to "fight climate change", ergo we are fighting to stop a naturally occurring event.
This is not a good thing, surely?
[/quote]
This is where the BBC has a DUTY to inform and explain.
When the BBC says climate change it deliberately mixes it up with man-made climate change... (the rest of the media is just as bad, but they are a bit dim - i.e. GMTV below).
Does the BBC ever consider every single natural process that has caused climate change throughout human history? Deserts come and go, sea levels rise and fall, glaciers expand and retreat, sea ice grows and shrinks; warm periods, little ice ages, etc.
Let alone prehistory - i.e. the previous 4 billion years before humanity's approximately 6,000-year blink of an eye (i.e. written records) - with cyclical ice ages over vast timeframes (that some process caused to MELT).
Only the insane would deny 'climate change' (which is how AGW advocates try to paint the sceptics).
i.e. the GMTV poll (similar in the Daily Mail - both post-Copenhagen):
Do you believe in climate change? Yes/no.
Rather than:
Do you believe in (natural) climate change? Yes/no.
Do you believe in man-made climate change? Yes/no.
And to be really honest, add another:
Do you believe in catastrophic, unprecedented global warming (climate change) due to man's CO2? Yes/no.
I imagine if the BBC were honest and brave enough to ask the general public this, they would get a result that they would not like.
The BBC uniquely has a duty to explain, inform and be impartial.
If the BBC behaved as it should, it would help the public understand the issue and allow informed debate.
Yet it appears that they would 'spin' the debate rather than clarify it, as they are advocates of AGW, not reporters/analysts.
for the record my answers:
yes
maybe yes
no
Tilde Guillemet
Thanks for your long post and your insight into our armed forces. My only comment would be that I hope the Navy’s Chief Petty Officer never has a sick day.
In para 45 (p24) of his submission, McKitrick refers to the "sexing up" of intelligence in the context of the Iraq war. This was also a link Phil Jones made to The Times (8 February 2010) talking of his "David Kelly" moment when he contemplated suicide.
There has been discussion of the span of disciplines from which the Russell Review should draw for expert analysis. The fragility of human nature when exposed - David Kelly talked to journalists and lied about it, Phil Jones couldn't document his assertions and lied about it - and the importance to society of the issues at risk from this fragility would seem to warrant its own experts from the psychiatric field.
Martyn,
I actually met the CPO and - in Navy tradition - I'm afraid that the quality of his work improved only after quite a few morning coffees. But, come what may, it worked. However, there were a few other people with 'the knowledge' who could step in.
I'll proceed with another slight rant about ISO certification while I'm at it.
It has been suggested that standard commercial software QA processes are required to get good code. In essence ISO certification for the process is deemed to be the holy grail and that climate science in particular fails to meet that standard.
My take - and I speak as someone who has been intimately involved in the QA process and ISO certification - is that that idea is frankly bollocks.
There is no requirement in the base ISO standards to have a formal plan, not even a document describing the plan. There are many interpretations of the standards and lots of pro-forma schemes have been developed but they are not essential. All that is required is that the processes in place can be shown to provide a control process over the quality of the resultant product.
At its very simplest, an ISO-certifiable QA scheme for wall plastering could be a verbal report of "I send the blokes out to plaster the wall and later I check to make sure they did a good job".
I strongly suggest that the ad-hoc but well established processes at CRU and many other climate organisations are in fact examples of quality control systems. There will be internal checks and balances and there is certainly an element of external review.
The fact they don't do unit tests and use any of the many arcania of software development is not important. What is important is that what comes out the door has been through a reality check to the best abilities of the people involved.
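For what it's worth, the "reality check" Tilde describes need not be heavyweight ISO machinery; it can be a few lines of code. A hedged sketch (the thresholds and the sentinel value are illustrative assumptions, not CRU's actual practice):

```python
def reality_check(monthly_temps_celsius):
    """Lightweight sanity checks of the kind a scientist applies by eye:
    no missing values, no unconverted missing-data sentinels, and
    nothing outside a physically plausible range.
    """
    problems = []
    for i, t in enumerate(monthly_temps_celsius):
        if t is None:
            problems.append(f"month {i}: missing value")
        elif t == -9999:  # a common missing-data sentinel (assumed here)
            problems.append(f"month {i}: unconverted sentinel")
        elif not -90.0 <= t <= 60.0:
            problems.append(f"month {i}: implausible value {t}")
    return problems
```

Run before anything goes out the door, this is a written-down version of the scientist's own judgement: cheap, automatic, and no three-month change-control cycle required.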
I make no comment on the actuality of CRU or any other organisation in their implementation of good scientific processes. I do comment that processes helpful to large organisations with many developers may well not be applicable to small scientific establishments whose focus is on the science, not on managing large teams of science-illiterate programmers. In fact, in almost all cases I find pure programmers to be seriously ignorant of most of the areas they work in, and especially to have no training in the scientific method - or even in any maths above first year, which they have usually forgotten.
Version control, large or small, is vital.
Phil Jones has said he knows he adjusted some data but has forgotten how!!!
Data control is also critical.
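The Jones anecdote is exactly what recording adjustments as named, replayable steps prevents. A hedged sketch of the idea (entirely illustrative; the adjustment names are invented, and this is not how CRU worked):

```python
def apply_adjustments(series, adjustments, log):
    """Apply a named sequence of adjustments to a data series,
    recording each step in `log` so the result can be reproduced
    (and explained) later.

    `adjustments` is a list of (name, function) pairs applied in order.
    """
    result = list(series)
    for name, func in adjustments:
        result = [func(x) for x in result]
        log.append(name)
    return result
```

With the raw series archived and the log kept alongside the result, "how was this adjusted?" is answered by reading the log and re-running it, rather than by anyone's memory.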
Ross McKitrick's submission to the Muir enquiry is comprehensive and excellent. You can find it here
http://sites.google.com/site/rossmckitrick/
It will make a cover-up embarrassing...
Look, this review will go like this:
It will be minimal -
1. There will be minimal meetings.
2. Minimal discussions.
3. Minimal considerations.
4. Minimal public reporting.
5. Minimal public openness.
It will also be a done deal -
1. A draft report has probably already been written up by Boulton and circulated by Russell to the other members of the review team.
2. Pressure will be put onto the review team members to endorse the draft report's findings.
3. The PR people will be actively involved in discussions on how and when this draft report will be ready for final publication.
4. The PR people will be actively involved in coaching review members how they should respond to public questions on how this report will be received.
We know this will be a whitewash, they know it will be a whitewash, everyone expects a whitewash. The key question for all is how successfully the review team will be able to pull off this whitewash.
Tilde Guillemet
I think you conveniently over simplify when you say,
“What is important is that what comes out the door has been through a reality check to the best abilities of the people involved”
What is also important is traceability and a history of anything that comes out of the door. An example, perhaps, could be Concorde, first flown 2 years before CRU was established and retired 35 years later: 35 years of maintenance, modifications and updates, all with a traceable history on every component, every piece of hardware and software. Then at the other end of the scale we have CRU and your plasterer.
Martyn
Traceability is not a requirement of the base ISO standards. Traceability is a derived 'product' of schemes to implement the ISO standards.
The whole point about the ISO9000 series is an effective control process for the product. Traceability may be an element but is not required.
If you go outside the ISO9000 requirements then quite possibly traceability is essential.
The discussion so far has been about using ISO9000 series quality systems. I continue to assert that the CRU - prima facie - complies. And I only say in response to comments requiring some derivative of ISO9000 that - subject to positive confirmation - the CRU appears to comply with the base requirements. Whether the compliance is well founded is another story.
Tilde G,
I couldn't agree more. Those 'standards' are all too often an excuse for the mediocre to hide behind - sorry, all the boxes are ticked, nothing to see here, move along.
An old boss described it very well when talking about a huge research project which, despite being tediously and endlessly ISO9000 compliant (as the main contractor kept repeating), was of course, late and over budget -
'If you don't use ISO9000, you have a f*#% up. If you do use ISO9000, you have a well documented F*#% up'
Chuckles
I think I would prefer a well documented f*#% up, you may have an easier chance of some sort of correction "Harry".
I can't access the site at all, either through your link or through Google- it says Internal Server Error. Last night when I read the submissions, there were three or four that wouldn't open.
Martyn, yes, normally I'd agree; my experience is that the preparation of 'standards' or statutory documentation, because it is a bureaucratic process/requirement, simply becomes an end in itself, and you get boilerplate, cut-and-paste, or drivel.
In many instances it is worse than having no documentation at all. Rather like backup systems that don't ever test the backups.
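The point about untested backups can be made concrete in a couple of lines: a backup is only known good once a restored copy has been read back and compared against the original. A minimal sketch (checksum comparison; illustrative only):

```python
import hashlib
from pathlib import Path

def backup_is_restorable(original, restored_copy):
    """Verify a backup by restoring it and comparing checksums.

    A backup that is never read back is not known to be a backup at all.
    """
    sha = lambda p: hashlib.sha256(Path(p).read_bytes()).hexdigest()
    return sha(original) == sha(restored_copy)
```

In practice this check would run automatically after every backup cycle, so a silently corrupted archive is caught while the original still exists.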
More grist for the mill.
I personally don't think the CRU and equivalent are intentionally malicious.
That they produce results that are controversial is accepted. That it is part of a global conspiracy not.
My beef about the "harry" comments is not that they show major flaws. Rather that they show an organisation in the throes of trying to get it right.
The later comments about traceability and good processes for software development miss the mark.
In the end it is the science published that is accountable.
The critique should be against the published papers and data, not the minutiae of the inferred processes by which the results were obtained.
I was impressed by McKitrick's evidence. How the Russell Committee responds to it will be very revealing.
If an aircraft falls out of the sky we can't say it was OK when it left the factory. Investigators need to identify the problems, trace back where/why/how it went wrong, then correct the problems, and that is done with the help of documentation. Perhaps it's just horses for courses, or maybe some pride in one's own workmanship, but a quality management system, whatever title you want to give it, should be in place from the concept of the product through its entire history and not just for the end result.
[quote]I was impressed by McKitrick's evidence. How the Russell Committee responds to it will be very revealing.[/quote]
Indeed. To exonerate Jones, the committee would need to ignore almost all of McKitrick's painstaking submission.
"My beef about the "harry" comments is not that they show major flaws. Rather that they show an organisation in the throes of trying to get it right."
Ian Harris ("harry") certainly tried his best. But please, read the countless aggravated comments he wrote about the horrible state their data and software were in. Ian Harris was convinced that bad quality was the norm at CRU:
"This still meant an awful lot of encounters with naughty Master stations, when really I suspect nobody else gives a hoot about. So with a somewhat cynical shrug, I added the nuclear option - to match every WMO possible, and turn the rest into new stations (er, CLIMAT excepted). In other words, what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad, but I really don't think people care enough to fix 'em, and it's the main reason the project is nearly a year late."
Rick Bradford: "To exonerate Jones, the committee would need to ignore almost all of McKitrick's painstakindg submission."
I'm sure they'll have no trouble doing just that.
Just returned to have a look at this thread. I don't have long now, but I think there is a really important debate to be had in the coming months about the direction that Barry Woods was, I think, taking us:
and the thought-provoking response from Tilde Guillemet:
As someone who has believed in and practised evolutionary delivery of systems since 1986, as proposed by Tom Gilb (rightly identified, in a PDF by the respected analyst Craig Larman, as the genesis of the modern agile software movement), but who has also always tried to relate this to user scripting (including, very importantly, any kind of spreadsheet work), I am extremely sympathetic to Tilde's point here.
Yet it's also true that the standards revealed at CRU are lamentable. To take the simplest example, it's not good enough to say of source code released by the Climategate whistleblower: this was never used in anger. The only response that counts is: here is the source code that was actually used, every last line of it, in every version that was used to transform data at any point. And if they can't do that (and it's fair to assume that they can't if they haven't) then that is a total disgrace. As is making such assertions without releasing everything they can, however incomplete.
This vital matter should not be left to a few members of Muir Russell's dodgy-looking panel. It should be released to all. The Select Committee has said exactly the right thing. It's time for CRU to deliver on that, warts and all, well before Russell or Oxburgh finish their work. A lot of us could really help to get to the bottom of the matter.
"So the scientists know the data. They know what they want to get out of it. They know how to spot an error [...] If the ISO crowd have their way then every spreadsheet would have to be specified in exact detail."
I know this argument. I've seen it used. But this is more a case of an incompetent "ISO crowd" not explaining properly than it is real software engineering. Tickbox checklists are just something to threaten people with if they act like morons and don't do it properly. The ideal is to get the scientists to sit down and work out what they do need and what they don't, design a system that delivers it, and make it available without fuss.
First, the scientist has to work in partnership with the software engineers. The scientist hacks the code and gets it to work. Then the software engineer goes through it and makes it bullet proof. The scientist does all the stuff with singular value decompositions and Fourier transforms, and then the software engineer puts in the error checking, exception handling, event logging, and clever bits to maintain database consistency.
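To make that division of labour concrete, here is a minimal sketch in Python. Everything in it is hypothetical - the function names and the trivial "gridding" routine are stand-ins, not anything from CRU's actual (largely Fortran/IDL) code. The point is only the shape: the scientist's numerical core stays small, and the engineer's wrapper adds validation, logging and exception handling around it.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("gridder")

def grid_anomalies(stations):
    """The scientist's core routine: average the station anomalies.
    (A deliberately trivial stand-in for the real numerical work.)"""
    return sum(stations) / len(stations)

def grid_anomalies_checked(stations):
    """The engineer's wrapper: validate inputs, log what happened,
    and fail loudly instead of silently producing nonsense."""
    if not stations:
        raise ValueError("no station data supplied")
    bad = [x for x in stations if not isinstance(x, (int, float))]
    if bad:
        raise TypeError(f"non-numeric values in input: {bad!r}")
    result = grid_anomalies(stations)
    log.info("gridded %d stations -> %.3f", len(stations), result)
    return result
```

The scientist never has to touch the wrapper; the engineer never has to touch the maths. An empty or corrupt input file then becomes a logged, traceable failure rather than a mystery of the "Harry" kind.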
Second, the process is made easy. The software engineer sets it up so that the current state can be archived with one click of the mouse - if that. Each version separate, dated, traceable. Backups are done automatically, and tested automatically. Templates and frameworks are written that the scientist can easily plug their modules into, that make sure everything can be scripted and recorded. And those scripts and records of output can be easily archived to make up the test documentation.
The easiest way to do documentation right is to teach it. When the program is done, the scientist sits down and teaches it to somebody else - what they say and do is recorded - and at the end you now have two people who know how to use it (always a good idea), and you know what needs to be explained and what can be safely skipped.
And so on. Pay no attention to those bureaucratic ISO charlatans - but at the same time don't think it doesn't need to be done.
It does take some effort to set up, but once it is going it should be fairly transparent to the scientist. Anything that is onerous, annoying, or unhelpful gets fixed, because anything that distracts them or takes them away from what they're good at is a waste of money. You employ cleaners to vacuum the floors and caterers to staff the canteen - you don't make scientists do it all themselves, nor let them go without. Why would anyone go to the trouble and effort of employing an expensively-trained scientist, and then not give them the facilities to do their job properly? Isn't that a waste of money too?
“78 page submission from CRU”
To paraphrase Einstein:
If we were wrong, then one would have been enough.