John Shade, who runs the Climate Lessons website, left these thoughts on Mike Hulme's lecture as a comment. I thought they were worth pulling into a post of their own.
I must confess to having enjoyed the talk by Professor Hulme. He spoke clearly and in a structured way, presented lots of ideas, and generally came across to me as intelligent and thoughtful. I just wish I were bright enough or informed enough to follow all of them. As it is, I still took a lot away from it, and the notes which follow are in part intended simply to share my puzzlements and prejudices.
In an age when computer models led the British government into patently foolish and expensive policy decisions on, for example, BSE, Swine Flu, and Climate, and into many an extravagant IT venture of its own, the scope for taking the piss about governments and computer models seems very large indeed. However, this was not the aim of the conference for which Prof Hulme gave this keynote address.
To my shame, I have a picture of him as a trendy CofE vicar trying to impress the WRI with his breadth of analysis and willingness to tackle controversy, say on the use of computer-generated parables for spreading the faith. The talk will take place in the church hall, at 5.30pm next Tuesday. All are welcome. He will mention some ways in which that practice could be assessed, and give the ladies a bit of a frisson by appearing to be open to at least the possibility that all is not well with it. However, he will avoid reference to the trenchant analysis of those skeptical of such products, and stick to the safer ground of quoting only from the faithful. At no point will he give the slightest hint that the whole silly business is a shameless artifice to win people over to his side, a side which is otherwise rather short of arguments in its favour. So at the end of the talk, covering ways in which the parables and their generators might be assessed, and sprinkled with hints that they are to be taken seriously nevertheless and might foretell severe calamity, he notes that care is required when using the parable-generators in a variety of cultures. So who could argue with that? The ladies will go off into the night, content that their Vicar is a bright chap who thinks deeply about many things. A bit like I did after watching the actual talk, but with some snarkiness building up inside due to my having bits of prior knowledge and opinions of my own.
He began by noting that humans are inclined to defer to authority, be it church or state or science, and find it hard to give up this deference. His talk will itself provide an illustration of this effect, epitomised by the respect he gives to the IPCC and to the UK Met Office, for example.
He mentions the considerable confidence asserted by the IPCC in climate models. <Not always consistently with the deliberations within the IPCC, however; e.g. see http://wattsupwiththat.com/2010/10/21/climate-model-deception-%E2%80%93-see-it-for-yourself/ >
<I will use italics and angle brackets where it seems necessary to make it clear that these are my remarks, or asides, and not a report from the talk.>
‘Are climate models worthy of our deference?’ - MH. He doesn’t provide a conclusive response to this, so the merit of his talk is in providing a plausible framework within which the question can be examined.
He notes that as climate models become more complex, the uncertainties become greater. Trenberth: ‘the spread in modelling results will be larger in the next IPCC assessment’.
<There is just more that can go wrong. A tacit admission of the relative crudity of the models, and of the misleadingly high confidence touted by the IPCC? I suppose this may mean a downplaying of loose talk of confidence in the next IPCC reports. They will need to find something else to keep the PR folk happy. Yet he overlooks the fact that, if this is the case, the modelers are making things worse and worse for themselves by trying to include more and more. Does it not occur to him that something is fundamentally wrong here? Perhaps such modeling, given our modest grasp of the science and the interactions, the stochastic messiness of it all, our modest programming skills, and our modest hardware, is intrinsically impossible? Weather forecasting computer models make sense because they are dealing mainly with things that already exist, such as a warm front or a hurricane, and they provide extrapolations of their development aided by hourly or three-hourly observations of the real situation. Climate models, or longer-range forecasts, on the other hand, have to deal with things that do not yet exist, and the options for them are myriad in terms of when, where, and how they will form and proceed, and what they will lead to next.>
He, Hulme, has developed a model, originally due to someone in the Netherlands called Arthur Petersen, with four ‘reliability’ dimensions to help pose relevant questions around whether, or to what extent, we should trust climate models. He talked through these, labelled R1 to R4, raising lots of interesting points in each. His own heart is clearly in the last of these, which is to do with the acceptance of models within cultures/societies:
R1 Coding precision. Is the computer code stable? By which he seems to mean: is it packed full of bugs or not? But that is not clear. Is the code portable, moveable? Assessed internally by coding centres. Public demands may be made to open up the model codes. Mentions CRU being asked to release computer code for their data analysis work. <No mention of the shambles revealed by the Harry Read Me file.> Mentions the Climate Code Foundation, who want to help restore public trust in climate science by getting code published automatically and routinely. <I suspect the climate establishment is not quite ready for that.>
R2 Statistical accuracy. To do with verification or validation. Do the results resemble observed climates? Recalled his own work with model validation from 1988 to 1998. <He did not mention prediction, which is more important, since models can be tweaked to fit any historical sequence.> Asserts that models of open and complex systems cannot be verified (Oreskes), and that ‘well-rehearsed difficulties’ include the quality of observations, and the extent to which models are independent of the data sets used to validate them. He notes there are many ways of judging model outputs. How is a model to be judged adequate? Refers to one assessment by Julia (he refers to her as ‘Jules’) Hargreaves, of a forecast by James Hansen at GISS. <Was it my fevered imagination, or did his delivery become a bit more hesitant, with more hums and ehs and little coughs, as he moved on to this topic? Now she seems to have had the novel wheeze of comparing Hansen’s predicted rising trend with a temperature held constant at the 1988 value, to see if there was skill in Hansen’s forecast. This is a bit like seeing a bus approaching in the distance and guessing when it will arrive, and claiming skill-superiority over a hypothetical someone who asserted that it would stay where it was. Not impressive. No mention was made of corollaries to Hansen’s temperature forecasts, such as his claim that New York would be flooded by an increase in mean sea level, and that the high temperatures would cause civil unrest in the remaining dry streets (see: http://stevengoddard.wordpress.com/2010/10/04/the-rumours-of-manhattans-death-are-exaggerated/ ).
For various other forecasts or extrapolations or projected scenarios or whatever associated with computer models, the partial list to be found here is instructive about ways in which these models have failed and can be expected to fail: http://z4.invisionfree.com/Popular_Technology/index.php?showtopic=2050 (scroll down several screens to the ‘Computer Climate Models’ section) >
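<A skill score of the kind mentioned above is easy to sketch. What follows is purely my own toy illustration with invented numbers, not Hargreaves’ actual calculation: a mean-squared-error skill score that compares a rising-trend forecast against a ‘no change’ baseline.>

```python
# Illustrative sketch only (not Hargreaves' actual method): an MSE-based
# skill score comparing a trend forecast against a 'no change' baseline.
# All numbers below are invented for demonstration.

def mse(pred, obs):
    """Mean squared error between two equal-length sequences."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(forecast, baseline, obs):
    """1 - MSE(forecast)/MSE(baseline): positive means the forecast
    beats the baseline; zero means no better; negative means worse."""
    return 1.0 - mse(forecast, obs) / mse(baseline, obs)

# Hypothetical temperature anomalies (degrees C) over five periods.
observed  = [0.10, 0.18, 0.22, 0.35, 0.42]
trend     = [0.12, 0.20, 0.28, 0.36, 0.44]   # a rising-trend forecast
no_change = [0.10] * 5                        # held at the starting value

print(skill_score(trend, no_change, observed))  # roughly 0.97 here
```

<Note how easily the trend forecast racks up a near-perfect score against a baseline that never moves; that is exactly why beating such a baseline strikes me as a low bar.>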
R3 Methodological quality. Focuses on the quality of model inputs, structure, expertise captured in the model, and the standing of the modellers. Refers to criticisms of inappropriate levels of expertise in UK Met Office modeling made in the House of Lords some years ago. <What’s this?> Raises the question of how to interpret differences between models. The IPCC uses ‘multimodel ensembles’ and uses the spread to assess uncertainty as if they were all independent. MH notes that the models are not independent.
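<His point about non-independence can be made quantitative with a standard statistical result; the sketch below is my own illustration, not anything presented in the talk. For n models whose errors share a common pairwise correlation, the ensemble behaves as if it contained far fewer than n independent models.>

```python
# My own illustration (not from the talk): the standard 'effective sample
# size' for n equally correlated estimates. If model errors share a
# pairwise correlation rho, the ensemble behaves like far fewer than n
# independent models, so its spread understates the true uncertainty.

def effective_n(n, rho):
    """Effective number of independent members among n estimates with
    common pairwise correlation rho (0 = independent, 1 = identical)."""
    return n / (1 + (n - 1) * rho)

print(effective_n(20, 0.0))   # 20.0: genuinely independent models
print(effective_n(20, 0.5))   # about 1.9: heavily shared code/assumptions
```

<On this arithmetic, twenty models sharing half their error in common tell you little more than two independent ones would, which rather undercuts treating the multimodel spread as an honest uncertainty estimate.>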
R4 Social acceptability. Requires scrutiny of ‘climate models as social objects in social worlds’. Which ‘political actors are offering their endorsements’, or even ‘bearing witness’ for or about these models. Refers to his PhD student Martin Mahoney’s work on PRECIS. This is a Met Office model, distributed to over 100 countries in the last 10 years or so. PRECIS is seen as effective for ‘raising awareness’. It ‘can make climate change real’. It can ‘attract public endorsements’. It was said by some player that it had been ‘useful to have the UNDP seal of approval on it’. Clearly PRECIS has been found by activists to be useful for ‘convincing policy makers that they should take a stand’. Hulme also claimed that the ‘epistemic authority of the model’ is safeguarded by its continued links to the UK Met Office. <Now a Met Office governed by the man who turned the WWF into a political organisation zealously campaigning on AGW, and which backed away from seasonal forecasting after making a fool of itself with confident talk of snowless winters and BBQ summers, would not appeal to me as a source of ‘epistemic authority’ if I were marketing climate models to anyone other than the most gullible of governments. I suspect the apparent success of PRECIS will one day make a useful study for anthropologists trying to make sense of the late 20th and early 21st century adoration of computer models and the ways in which they seemed to overwhelm policy makers.>
A statement towards the end of his talk: ‘Models offer the single most powerful way for scientists to organise their knowledge’. <I am inclined to the view that they can get in the way of scientists’ knowledge. I remember when I studied theoretical physics as a not very bright student (probably the least able in my class) who nevertheless had disdain for those who resorted to computers to get results, often in very crude, inelegant ways, rather than grappling with theory and experiment and new ideas. Computer modeling easily became an end in itself, with its own language and challenges, and it seemed like something which contributed nothing of intellectual value, albeit being most welcome for doing arithmetical chores. In this way, I suspect the apparent rise to prominence of computer models has damaged progress in climate science, diverting funds and fame away from work of more modest import but of more lasting value. As it is, it sometimes seems, to me as a rank outsider, that climate science has degenerated into a study of climate models.>
Another interesting remark towards the end: ‘Understanding the social authority of climate models becomes critically important’. <This, I think, is what particularly interests Prof Hulme, and it may have led to the otherwise misleading title of his book, ‘Why We Disagree about Climate Change’, a book which concentrates on social and political topics, by and large taking the IPCC line on climate as a given.>
The final slide showed that the UK Met Office climate projections depend entirely on climate models. The Netherlands, on the other hand, were much more eclectic:
‘UK 2009 – scenarios extracted from one model hierarchy from one modeling centre funded by the British Government. Offered sophisticated probabilities of outcomes at fine temporal (hours) and spatial (5 km) scales.
Netherlands 2006 – scenarios which used a synthesis of insights and methodologies: climate modeling, historical evidence, local meteorological reasoning. Expert judgement to blend different information sources.’
<Now I haven’t seen either of them. I suspect the UK one is fine-grained baloney, replete with snowless winters and long hot summers, and that the one from the Netherlands is altogether more grounded in reality. But that’s mere prejudice, from someone whose first formal instruction in FORTRAN came from someone who advised that computers should not be used for anything where you do not know what the right answer is. Now I see that that can include the writing of software to provide predetermined answers of highly desirable ‘rightness’, such as ‘exposing’ a dominating role for CO2 in the atmosphere.>
Final quote, referring to the IPCC: ‘Its authoritative deployment of climate models and their simulated knowledge of putative future climate becomes dangerous if, I would suggest, in doing so it erases the differences that exist between cultures and societies in the different forms of social authority that are granted to climate models.’
<I still don’t know what this means. He read it out very deliberately from his notes, and so it is not just a remark thrown in at haste to conclude his talk. He seems a bright chap, much brighter than me, so I think there will be a lot of interesting ideas in and around it. I just need more help to see them.>
Overall, an interesting talk, one in which Prof Hulme has provided a useful framework for assessing computer models, but a talk in which he has shied away from any ruthless application of the framework. It is as if he is reasonably content with the way in which things have been going with the IPCC and its impact on world affairs, and it is only his intellectual curiosity that draws him to talk of model impacts in society. But for me, and many others, our main concern is that this impact, this use of computer models, has been excessive and harmful, and could yet cause a great deal more harm before the dust they deserve has settled upon them.