Shade on Hulme
John Shade, who runs the Climate Lessons website, left these thoughts on Mike Hulme's lecture as a comment. I thought they were worth pulling into a post of their own.
I must confess to having enjoyed the talk by Professor Hulme. He spoke clearly and in a structured way, presented lots of ideas, and generally came across to me as intelligent and thoughtful. I just wish I were bright enough or informed enough to follow all of them. As it is, I still got lots to think about from it, and the notes which follow are in part intended just to share my puzzlements and prejudices.
In an age when computer models led the British government into patently foolish and expensive policy decisions on, for example, BSE, Swine Flu, and Climate, and into many an extravagant IT venture of its own, the scope for taking the piss about governments and computer models seems very large indeed. However, this was not the aim of the conference for which Prof Hulme gave this keynote address.
To my shame, I have a picture of him as a trendy CofE vicar trying to impress the WRI with his breadth of analysis and willingness to tackle controversy, say on the use of computer-generated parables for spreading the faith. The talk will take place in the church hall, at 5.30pm next Tuesday. All are welcome. He will mention some ways in which that practice could be assessed, and give the ladies a bit of a frisson by appearing to be open to at least the possibility that all is not well with it. However, he will avoid reference to the trenchant analysis of those skeptical of such products, and stick to the safer ground of quoting only from the faithful. At no point will he give the slightest hint that the whole silly business is a shameless artifice to win people over to his side, a side which is otherwise rather short of arguments in its favour. So at the end of the talk, covering ways in which the parables and their generators might be assessed, and sprinkled with hints that they are to be taken seriously nevertheless and might foretell severe calamity, he notes that care is required when using the parable-generators in a variety of cultures. So who could argue with that? The ladies will go off into the night, content that their Vicar is a bright chap who thinks deeply about many things. A bit like I did after watching the actual talk, but with some snarkiness building up inside due to my having bits of prior knowledge and opinions of my own.
He began by noting that humans are inclined to defer to authority, be it church or state or science, and find it hard to give up this deference. His talk will itself provide an illustration of this effect, epitomised by the respect he gives to the IPCC and to the UK Met Office, for example.
< I will use italics and angle brackets where it seems necessary to make it clear that these are my remarks, or asides, and not a report from the talk.>
He mentions the considerable confidence asserted by the IPCC in climate models. <Not always consistent with the deliberations within the IPCC, however, e.g. see
http://wattsupwiththat.com/2010/10/21/climate-model-deception-%E2%80%93-see-it-for-yourself/ >
‘Are climate models worthy of our deference?’ (MH). He doesn’t provide a conclusive answer to this, so the merit of his talk lies in providing a plausible framework within which the question can be examined.
He notes that as climate models become more complex, the uncertainties grow. Trenberth: ‘the spread in modelling results will be larger in the next IPCC assessment’.
<There is just more that can go wrong. A tacit admission of the relative crudity of the models and the misleadingly high confidence touted by the IPCC? I suppose this may mean a downplaying of loose talk of confidence in the next IPCC reports. They will need to find something else to keep the PR folk happy. Yet he overlooks the fact that, if this is the case, the modelers are making things worse and worse for themselves by trying to include more and more. Does it not occur to him that something is fundamentally wrong here? Perhaps that such modeling, given our modest grasp of the science and the interactions, the stochastic messiness of it all, our modest programming skills, and our modest hardware, is intrinsically impossible? Weather forecasting computer models make sense because they are dealing mainly with things that already exist, such as a warm front or a hurricane, and they provide extrapolations of their development aided by hourly or three-hourly observations of the real situation. Climate models, or longer-range forecasts, on the other hand, have to deal with things that do not yet exist, and the options for them are myriad in terms of when, where and how they will form and proceed, and what they will lead to next.>
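<A toy illustration of that distinction, and emphatically not anything from the talk or from any real model: the sketch below runs a simple chaotic system (Lorenz-63, integrated with a crude Euler step) twice from a slightly wrong starting point. One run is nudged back towards 'observations' every few steps, roughly the advantage a weather forecast enjoys; the other runs free, as a multi-decade projection must. The equations, nudging interval and error sizes are arbitrary choices of mine.>

```python
# Toy demonstration only: observation-constrained vs free-running integration
# of a chaotic system. Nothing here is taken from a weather or climate model.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One crude Euler step of the Lorenz-63 system."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

truth  = np.array([1.0, 1.0, 1.0])   # stands in for the real atmosphere
nudged = truth + 1e-3                # small initial error, periodically corrected
free   = truth + 1e-3                # same initial error, never corrected

for step in range(5000):
    truth, nudged, free = lorenz_step(truth), lorenz_step(nudged), lorenz_step(free)
    if step % 50 == 0:               # periodic 'observations' pull the nudged run back
        nudged += 0.5 * (truth - nudged)

print("final error, observation-constrained run:", np.linalg.norm(truth - nudged))
print("final error, free-running run:           ", np.linalg.norm(truth - free))
```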
He, Hulme, has developed a framework, originally due to Arthur Petersen in the Netherlands, with four 'reliability' dimensions to help pose relevant questions around whether, or to what extent, we should trust climate models. He talked through these, labelled R1 to R4, raising lots of interesting points in each. His own heart is clearly in the last of these, which is to do with the acceptance of models within cultures/societies:
R1 Coding precision. Is the computer code stable? By which he seems to mean, is it packed full of bugs or not, but that is not clear. Is the code portable, moveable? Assessed internally by coding centres. Public demands may be made to open up the model codes. Mentions CRU being asked to release the computer code for their data analysis work. <No mention of the shambles revealed by the Harry Read Me file.> Mentions the Climate Code Foundation, who want to help restore public trust in climate science by getting code published automatically and routinely. <I suspect the climate establishment is not quite ready for that.>
R2 Statistical accuracy. To do with verification or validation. Do the results resemble observed climates? Recalled his own work on model validation from 1988 to 1998. <He did not mention prediction, which is more important, since models can be tweaked to fit any historical sequence.> Asserts that models of open and complex systems cannot be verified (Oreskes?), and that ‘well-rehearsed difficulties’ include the quality of observations and the extent to which models are independent of the data sets used to validate them. He notes there are many ways of judging model outputs. How is a model to be judged adequate? Refers to one assessment by Julia (he refers to her as ‘Jules’) Hargreaves, of a forecast by James Hansen at GISS. <Was it my fevered imagination, or did his delivery become a bit more hesitant, with more hums and ehs and little coughs, as he moved on to this topic? Now she seems to have had the novel wheeze of comparing Hansen’s predicted rising trend with a temperature held constant at the 1988 value to see if there was skill in Hansen’s forecast. This is a bit like seeing a bus approaching in the distance and guessing when it will arrive, and claiming skill-superiority over a hypothetical someone who asserted that it would stay where it was. Not impressive. No mention was made of corollaries to Hansen’s temperature forecasts, such as his claim that New York would be flooded by an increase in mean sea level, and that the high temperatures would cause civil unrest in the remaining dry streets (see: http://stevengoddard.wordpress.com/2010/10/04/the-rumours-of-manhattans-death-are-exaggerated/ ). For various other forecasts or extrapolations or projected scenarios or whatever associated with computer models, the partial list to be found here is instructive about the ways in which these models have failed and can be expected to fail: http://z4.invisionfree.com/Popular_Technology/index.php?showtopic=2050 (scroll down several screens to the ‘Computer Climate Models’ section)>
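<For what it is worth, the sort of comparison described there can be sketched in a few lines. A forecast's 'skill' is conventionally scored against a trivial baseline such as 'no change'; the weaker the baseline, the easier the claim of skill. The numbers below are invented purely for illustration and are not Hansen's forecasts or Hargreaves' data.>

```python
# A minimal sketch of a skill score against a "no change" (persistence) baseline.
# All numbers are made up for illustration; none are from Hansen or Hargreaves.
import numpy as np

observed    = np.array([0.00, 0.05, 0.12, 0.08, 0.20, 0.25])  # hypothetical anomalies
forecast    = np.array([0.00, 0.06, 0.11, 0.18, 0.22, 0.30])  # hypothetical model forecast
persistence = np.full_like(observed, observed[0])              # "stays at its 1988 value"

mse_forecast = np.mean((forecast - observed) ** 2)
mse_baseline = np.mean((persistence - observed) ** 2)

# Skill > 0 merely means the forecast beats the chosen baseline; the weaker the
# baseline, the easier the claim of "skill".
skill = 1.0 - mse_forecast / mse_baseline
print(f"skill relative to persistence: {skill:.2f}")
```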
R3 Methodological quality. Focuses on the quality of model inputs, structure, the expertise captured in the model, and the standing of the modellers. Refers to criticisms of inappropriate levels of expertise in UK Met Office modeling made in the House of Lords some years ago. <What’s this?> Raises the question of how to interpret differences between models. The IPCC uses ‘multi-model ensembles’ and uses the spread to assess uncertainty as if the models were all independent. MH notes that the models are not independent.
R4 Social acceptability. Requires scrutiny of ‘climate models as social objects in social worlds’. Which ‘political actors are offering their endorsements’, or even ‘bearing witness’ for or about these models. Refers to his PhD student Martin Mahony’s work on PRECIS. This is a Met Office model, distributed to over 100 countries in the last 10 years or so. PRECIS is seen as effective for ‘raising awareness’. It ‘can make climate change real’. It can ‘attract public endorsements’. It was said by some player that it had been ‘useful to have the UNDP seal of approval on it’. Clearly PRECIS has been found by activists to be useful for ‘convincing policy makers that they should take a stand’. Hulme also claimed that the ‘epistemic authority of the model’ is safeguarded by its continued links to the UK Met Office. <Now a Met Office governed by the man who turned the WWF into a political organisation zealously campaigning on AGW, and which backed away from seasonal forecasting after making a fool of itself with confident talk of snowless winters and BBQ summers, would not appeal to me as a source of ‘epistemic authority’ if I were marketing climate models to anyone other than the most gullible of governments. I suspect the apparent success of PRECIS will one day make a useful study for anthropologists trying to make sense of the late 20th and early 21st century adoration of computer models and the ways in which they seemed to overwhelm policy makers.>
A statement towards the end of his talk: ‘Models offer the single most powerful way for scientists to organise their knowledge’. <I am inclined to the view that they can get in the way of scientists’ knowledge. I remember that when I studied theoretical physics, as a not very bright student (probably the least able in my class), I still had disdain for those who resorted to computers to get results, often in very crude, inelegant ways, rather than grappling with theory and experiment and new ideas. Computer modeling easily became an end in itself, with its own language and challenges, and it seemed like something which contributed nothing of intellectual value, albeit being most welcome for doing arithmetical chores. In this way, I suspect the apparent rise to prominence of computer models has damaged progress in climate science, diverting funds and fame away from work of more modest import but of more lasting value. As it is, it sometimes seems to me, as a rank outsider, that climate science has degenerated into a study of climate models.>
Another interesting remark towards the end: ‘Understanding the social authority of climate models becomes critically important’. <This, I think, is what particularly interests Prof Hulme, and it may have led to the otherwise misleading title of his book, ‘Why We Disagree about Climate Change’, a book which concentrates on social and political topics, by and large taking the IPCC line on climate as a given.>
The final slide showed that the UK Met Office climate projections rely entirely on climate models. The Netherlands, on the other hand, was much more eclectic.
‘UK 2009 – scenarios extracted from one model hierarchy from one modeling centre funded by the British Government. Offered sophisticated probabilities of outcomes at fine temporal (hours) and spatial (5 km) scales.
Netherlands 2006 – scenarios which used a synthesis of insights and methodologies: climate modeling, historical evidence, local meteorological reasoning. Expert judgement to blend different information sources.’
<Now I haven’t seen either of them. I suspect the UK one is fine-grained baloney, replete with snowless winters and long hot summers, and that the one from the Netherlands is altogether more grounded in reality. But that’s mere prejudice, from someone whose first formal instruction in FORTRAN came from a teacher who advised that computers should not be used for anything where you do not know what the right answer is. Now I see that that can include the writing of software to provide predetermined answers of highly desirable ‘rightness’, such as ‘exposing’ a dominating role for CO2 in the atmosphere.>
Final quote, referring to the IPCC: ‘Its authoritative deployment of climate models and their simulated knowledge of putative future climate becomes dangerous if, I would suggest, in doing so it erases the differences that exist between cultures and societies in the different forms of social authority that are granted to climate models.’
<I still don’t know what this means. He read it out very deliberately from his notes, and so it is not just a remark thrown in at haste to conclude his talk. He seems a bright chap, much brighter than me, so I think there will be a lot of interesting ideas in and around it. I just need more help to see them. >
Overall, an interesting talk, one in which Prof Hulme has provided a useful framework for assessing computer models, but a talk in which he has shied away from any ruthless application of the framework. It is as if he is reasonably content with the way in which things have been going with the IPCC and its impact on world affairs, and it is only his intellectual curiosity that draws him to talk of model impacts in society. But for me, and many others, our main concern is that this impact, this use of computer models, has been excessive and harmful, and could yet cause a great deal more harm before the dust they deserve has settled upon them.
Reader Comments (36)
Mike Hulme displays poor knowledge of the internal workings of a computer and the impact of high-level languages. As someone who has worked at register level, I can say it is logical consistency that is the major player in writing successful code. If you want precision, you simply buy it.
"I still don’t know what this means."
I guess he means that they're putting all their eggs in one 'climate modelling' basket. If climate models lose credibility, so does the message.
I note none of his four dimensions seem to include making verifiable predictions which have been borne out by observation.
It would be good to get the IT governance community to have a go at standards for computer models for public policy. This article illustrates the need for a proper framework (which would be much more substantial than the items above), and after so many horrendous failures, such a framework is overdue. It is hard to believe that such a framework doesn't exist.
I'd love to know more about parameterisation. That is, in the model, how many of the numbers necessary to do the calculations are fixed, and how many are parameters which are changed between runs? Do these parameters include such things as the CO2 sensitivity, or does that number emerge as a result because fundamental measured numbers are used? Picking the parameters correctly is the key to getting the results correct. The problem is that if you have some idea of the result you want, you will tend to pick parameters to achieve that result, and reject other results as 'bad runs'. We don't get told much about the parameters. We don't get told anything about bad runs.
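For what it's worth, the worry in the comment above can be sketched in a few lines. The 'model' below is a deliberately trivial, hypothetical stand-in with two tunable parameters and an invented target trend; it bears no relation to any real GCM or its actual parameter names.

```python
# A toy tuning exercise: many different parameter choices can reproduce the
# same "observed" number equally well. Everything here is hypothetical.
import numpy as np

observed_trend = 0.17   # hypothetical tuning target, degC per decade

def toy_model(sensitivity, aerosol_factor):
    # a stand-in "model" whose output depends on two tunable parameters;
    # it is not a representation of any real climate model
    return 0.1 * sensitivity - 0.2 * aerosol_factor

good_fits = []
for sensitivity in np.linspace(1.5, 4.5, 31):          # plausible-looking ranges
    for aerosol_factor in np.linspace(0.0, 1.0, 21):
        if abs(toy_model(sensitivity, aerosol_factor) - observed_trend) < 0.005:
            good_fits.append((round(sensitivity, 2), round(aerosol_factor, 2)))

print(len(good_fits), "parameter pairs reproduce the observed trend, e.g.", good_fits[:3])
# Matching the past does not pin down the parameters that control the future:
# pairs with very different 'sensitivity' all pass the same tuning test.
```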
I heard him say that climate models could not even in theory be verified and attempts to verify them had met with little success. He went on at length about the social authority of climate models and ways to sell it.
If the models could be verified, there would be no need to use various puffs to sell them. An unverified or unverifiable computer model amounts to hi-tech guesswork at best, and at worst merely a device which can be tweaked into selling a political agenda as something with the authority of science. If a political agenda has to use such dishonest means, it can hardly be one that would be welcomed on its merits.
It leaves me wondering whether climate models are worth spending much effort developing at all and certain that basing far-reaching public policy decisions on them is wrong.
From Hansard and Professor Slingo's testimony:
From a BBC news article
By my calculations, an annual bias of just +0.05C from what is, according to Professor Slingo, the same code used for their climate models, results in a bias of +0.5C per decade or +5C per century. Isn't 5C roughly what they are predicting for the next 100 years?
Admittedly, they may use the same code for daily forecasts and climate models but completely different code for annual forecasts, although I doubt it.
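The arithmetic behind that comment, spelled out (the +0.05C per year figure is the commenter's reading of the testimony and is simply assumed here):

```python
# Compounding of an assumed annual bias; the 0.05 degC/yr figure is taken from
# the comment above, not verified against any Met Office source.
annual_bias_degC = 0.05
print(f"{annual_bias_degC * 10:.1f} degC accumulated over a decade")
print(f"{annual_bias_degC * 100:.1f} degC accumulated over a century")
```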
I don’t see Mike Hulme as a particularly helpful individual. Much too postmodern for my taste. I think we lose sight of the scientific method at our peril, because it is our criterion for effective behaviour – that is, behaviour that is effective with respect to the real world, where our environment behaves as we expect it to. Mike Hulme seems to favour behaviour that is effective socially, in that it is socially or culturally cohesive, even if by more objective standards it is also incompetent. It is surely the case that the incompetence will eventually cause problems.
Yesterday, Phillip Bratby made the comment:
Hulme often talks in riddles that make no sense to a normal person. I think it's deliberate obfuscation.
And today Mac noted: "Mike Hulme displays poor knowledge of the internal workings of a computer and the impact of high-level languages."
I agree entirely with both comments. And to use an old Silicon Valley marketing ploy: "When you can't dazzle them with brilliance, baffle them with bullshit."
I am not at all impressed by Mike Hulme. Perhaps he should try being a shaman.
Oh, Mac, while I agree you have to buy the right hardware to get precision, one still has to use it properly. The problem is that most graduate student programmers I have worked with haven't much knowledge and make really dumb design and coding mistakes. I suspect Mike Hulme is an example.
@ Rhoda:
You will find this interesting:
http://judithcurry.com/2010/10/10/the-culture-of-building-confidence-in-climate-models/
Make sure to read the post at Steve Easterbrook's blog and the Pope & Davies paper Curry links to in her essay.
I am not sure that Mike Hulme and everybody else are talking the same language, particularly with regards to verification and validation of the model (or code). I know that in the past they had the opposite meaning in the US compared to Britain.
Verification is ensuring that the coding produced is exactly as specified. Thus if the equation is specified as y=ax+c and is coded as y=ax-c, then verification should pick up and correct that error. A verified code would not contain such errors. Verification would also show any shortcomings in the code that need addressing, such as: is the output sensitive to the grid size used; is the output consistent at different timestep lengths? There are dozens of things that verification should address to ensure that the output reflects the intent of the code specifier.
Validation is confirming that the code agrees with experimental data for the wide range of conditions for which the code will be used (to within specified uncertainties). The validation will confirm over what conditions the code is valid. Obviously for calculating the earth's climate, experimental data is real world data, and as I stated yesterday, back-casting is not ideal as it is prone to bias and a limited range of data.
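A toy rendering of that distinction, using the y = ax + c example from the comment above (the sign error is left in deliberately; none of this is taken from any actual climate code):

```python
# Verification vs validation, illustrated on the commenter's own toy example.
import numpy as np

def spec(a, x, c):
    return a * x + c            # the equation as specified

def implementation(a, x, c):
    return a * x - c            # a deliberate coding slip: minus instead of plus

# Verification: does the code match the specification? The deliberate sign
# error above is exactly the kind of thing this check should catch.
test_points = [(1.0, 2.0, 3.0), (0.5, -1.0, 2.0), (2.0, 0.0, 1.0)]
verified = all(np.isclose(implementation(a, x, c), spec(a, x, c))
               for a, x, c in test_points)
print("verification passed:", verified)   # False until the sign error is fixed

# Validation comes afterwards, and only means something once verification has
# passed: compare the code's output with independent observations over the
# whole range of conditions for which it will be used, e.g.
#   np.allclose(model_output, observations, atol=tolerance)
```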
The problem with parameterisation is that again it is subject to bias and is not applicable to all conditions. Ideally a climate model should be calculating the climate from first principles. It is highly unlikely that a parameterised model (if that is the correct description) is validated for all the conditions to which it will be applied.
These are the things you learn from on-the-job experience in an environment in which best practice and rigorous QA are used, and I suspect that Hulme has little if any experience of developing or using a climate model and so is just pontificating (BSing) from a position of zero knowledge (but I must admit I have not listened to his talk).
Quoting Shade: "As it is, it seems sometimes, to me as a rank outsider, that climate science has degenerated into a study of climate models."
I have been saying this for a long time; sorry if I sound like a broken record. This observation is detailed in Lindzen's paper: "Climate Science: Is it currently designed to answer questions?"
http://icecap.us/images/uploads/ClimateScience-arXiveRSLindzenRev3a.pdf
"One result of the above appears to have been the deemphasis of theory because of its intrinsic difficulty and small scale, the encouragement of simulation instead (with its call for large capital investment in computation), and the encouragement of large programs unconstrained by specific goals. In brief, we have the new paradigm where simulation and programs have replaced theory and observation, where government largely determines the nature of scientific activity, and where the primary role of professional societies is the lobbying of the government for special advantage."
His paper is quite long, but a reading of the first 9-10 pages details the message.
The Lindzen essay Dr Crinum mentions above is worth reading in full. Inter alia, it details the way that major institutions - eg the US National Academy of Sciences (NAS) - were colonised by active protagonists of CAGW. In other words, the Rise of the Consensus!
Paraphrasing John Shade:
In 2000, one of the greatest works of interactive fiction and quite possibly the greatest computer game of all time, the terrifyingly prescient and brilliant Deus Ex was released to a somewhat mute reception. As computer scientists, programmers and users, if you haven't done so yet - just buy the game and give it a week of your time - the experience can be eye-opening.
There is a conversation between 'Morpheus' and JC Denton, the lead character (watch here). Morpheus is a computer surveillance system. It goes like this....
JC DENTON: I don't see anything amusing about spying on people.
MORPHEUS: Human beings feel pleasure when they are watched. I have recorded their smiles as I tell them who they are.
JC DENTON: Some people just don't understand the dangers of indiscriminate surveillance.
MORPHEUS: The need to be observed and understood was once satisfied by God. Now we can implement the same functionality with data-mining algorithms.
JC DENTON: Electronic surveillance hardly inspired reverence. Perhaps fear and obedience, but not reverence.
MORPHEUS: God and the gods were apparitions of observation, judgment, and punishment. Other sentiments toward them were secondary.
JC DENTON: No one will ever worship a software entity peering at them through a camera.
MORPHEUS: The human organism always worships. First it was the gods, then it was fame (the observation and judgment of others), next it will be the self-aware systems you have built to realize truly omnipresent observation and judgment.
JC DENTON: You underestimate humankind's love of freedom.
MORPHEUS: The individual desires judgement. Without the desire, the cohesion of groups is impossible, and so is civilization.
The comments have been a tad unfair to Hulme. He is not a model-worshipper but does defer to most of the prevailing assumptions that inform them. Unlike Mann et al, he rejects the notion that a reasonable inference of risk from AGW must, as a matter of logical necessity, support the kinds of central government action that the warmistas invariably tout, such as cap and trade.
There is so much kneejerk nonsense in climate-debating that when someone like Hulme is making an intelligent effort to operate outside the dogmatic box, it should spark interest and engagement and not a kneejerk suspicion that his reasonable pose is a snare for the unwary climate skeptic.
As I read these comments, I am curious about the meaning of "parameterisation", as is Rhoda, who raises some interesting questions.
If one has enough knobs and levers, you can make just about anything match anything you would like. It is the old "Graduate Student Algorithm" -- you know, lock up a graduate student in an office or lab, throw in some candy bars and cans of coke from time to time and don't let him out until he produces the desired result.
The more I read about Mike Hulme's approach, the less I like it. It reminds me of my graduate student days too much.
Good post. Very interesting.
IMHO, ultimately, climate models in themselves have no intrinsic authority; they are just a series of zeros and ones. The authority with which Hulme so keenly speaks, and the deference he suggests we hold for models, is not in fact for the models at all but for those that create the models: the deep climate Vatican: the modellers and agencies involved in their manufacture and management.
Hulme implies that they are ultimately tools to shape behaviour, not to shape a distant snapshot of future climate. I think we can see that the thrust of Hulme’s lecture is a confirmation of his view of climate science as a post-normal activity – because he sees climate models as sophisticated levers of cultural and societal change, where the emphasis is not on their accuracy, objectivity or scientific validity, but on the moral, ethical and societal issues that result from “projections”. The media invariably portray simulations as predictions, and that prediction attracts the superstitious deference he exemplifies.
The models appear – at least in Hulme’s mind – as a kind of automated cyber-policy contraption, where ludicrously complex mathematical formulae representing air, wind, rain and sea are fed in, but out pops (pre-determined) justification for societal change – on a global scale.
Hulme’s interests appear to be in suggesting that models themselves have authority when they do not; in suggesting the deference we have is for this wondrous mechanism and not the agenda of those that conceive its use. Why? Perhaps because he thinks we are stupid.
@ justin ert
Agree. Good comment.
Also, Hulme is a Christian (as is, for example Sir John Houghton) and there is an interesting undercurrent of Christian morality in the mix of motivations here. Consider the emergent concepts of climate justice and responsibility (atonement) advanced on many fronts but originated by such as Hulme and Houghton.
justin ert,
"Hulme’s interests appear to be in suggesting that models themselves have authority when they do not; in suggesting the deference we have is for this wondrous mechanism and not the agenda of those that conceive its use. Why? Perhaps because he thinks we are stupid."
I don't think it's exactly Hulme thinking we are stupid.
He's a man on a mission with Post Normal Science, the idea that science should bend to fit political ends. Presumably these are ends of which he approves rather than suggesting science should be a gun for hire for any given political end. Frankly, I consider PNS to be outrageous.
Science bent to fit support of a political end is not science at all, it's scientific sounding blather. We've been there before with Lysenkoism.
While not being stupid, relatively few people feel they are in a position to call bullshit on scientific blather or computer climate models. I think these climate models are a device politicians use to convince themselves as much as the populace. The scientific establishment has put itself in a position where climbing down would be hard even if it wanted to.
I think most people revere computers. Oftentimes those who do not learn how to operate a PC fail to do so out of mystical fear. Can computers not perform tasks almost instantaneously that humans could not realistically accomplish without a massive investment of time and possibly travel? Do computers not provide answers for most inquiries? Computers are everywhere: our homes, our cars, our work, our schools, our phones, our pockets, etc. And how many computers are now being utilized for surveillance? You are being watched wherever you go ... by a computer or computer-controlled device.
We rely upon computers for many things. We trust them -- computers never make mistakes, only the operators and programmers. Computers coupled with the Internet become authoritarian. Who can argue against computer results? We even talk to computers via voice recognition software while we are driving our cars or performing computer tasks. Computers don't just give us answers, they often tell us what to do. Computers make us feel important; we can all become 'somebody' on the Internet via our computer. Our dependence on computers is such that none of us could bear to live without them.
In the background, how many people have subconscious thoughts concerning artificial intelligence and how it may overtake thinking for mankind? 2001: A Space Odyssey delivered a lingering concept -- a miniature HAL may be lurking in all of our computers.
I think Shub is on the right track. We are close to worshiping computers as a higher form of intelligence. Perhaps the AGW scientific crowd is already there.
Hulme (unexpectedly) has been critical of both Climategate and the IPCC (since Climategate). Agree with you, cosmic - he is a post-normal scientist and sees the situation as a politician would, hedging his bets - which is why some of what he says appears ambiguous or confusing to us.
cosmic
"While not being stupid, relatively few people feel they are in a position to call bullshit on … climate models".
I agree, and I think this is the power of the models Hulme is attempting to harness - the deference he seeks to promote. Conveniently, the depth and sheer complexity of their structure, programming and code-base is so labyrinthine as to be beyond even the most authoritative criticism. I can only see this as the castle’s keep for deep climate: an impenetrable wall of complexity protecting an elite and their final refuge from an assault.
R3 "Methodological quality...focuses on...standing of the modellers...."
R4 "Social acceptability...[yatta-yatta]...PRECIS seen as effective for ‘raising awareness’. It ‘can make climate change real’. It can ‘attract public endorsements’...[yatta-yatta]...‘convincing policy makers that they should take a stand’.
Kudos to MH as a person, but this material has a very low signal to drivel ratio. Climate models are to science as a dirty jockstrap is to a ball gown, and always will be.
justin ert,
"Conveniently, the depth and sheer complexity of their structure, programming and code-base is so labyrinthine to be beyond even the most authoritative criticism."
That's being lured into going for the wrong target, the one which is offered. Rather than discussing modelling software methodologies, differential equations and all sorts of other technicalities, the simple and important question is: "Do these models work? That is, do they demonstrate useful predictive ability?" If they don't, it doesn't matter how elegant the maths behind them is or how clever the people who produce them are: they don't work. If we are not sure whether they work or not, we are not confident that they do work, so they are no basis for serious policy decisions.
It's also convenient that there are long timescales involved, making predictions decades ahead. All that means is that it is harder, or impossible, to validate them, so all the more reason to distrust them.
As for Hulme's point that it doesn't matter whether they work or not, they are there to illustrate scenarios so as to provide social motivation, I see that as an admission they are a propaganda tool and he sees no problem in hoodwinking people with them.
A computer is an adding machine that works very fast.
It is flexible (can address any problem that can be expressed in mathematical terms).
It can be automated (programmed).
It cannot think (its output is pre-determined by the data plus the assumptions coded into the program).
It can only and will only carry out the instructions of those who program it.
It is their assumptions that must be analysed and either accepted or rejected.
So do not concern yourself with climate models.
Instead analyse the assumptions of climatologists.
Two model runs A and B, to predict the temp increase at the end of century.
The parameters for the runs are set such that both are believed to produce valid results.
Model Run A produces a temp difference of 3 degrees
Model Run B produces a temp difference of 100 degrees
Ask the "man in the street" which model run is the better? 95% will say Run A: "Obviously, model run B is totally wrong".
Of course both model runs were designed with "intelligent" decisions on parameter settings. They are actually both equally valid. We understand about confirmation bias. That is not the point though.
Explain to the man in the street that the only difference between the model runs was that one parameter was set to 1.00000005 in run A and to 1.00000006 in run B.
And providing they even understand what a parameter is, they will never understand the reality of modelling things that cannot be matched (or even calibrated) against something independent and known.
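That point can be made concrete with a toy nonlinear system. The logistic map below is only a stand-in for 'a model with one tunable parameter'; its output is a dimensionless number, not a temperature, and the parameter values are arbitrary.

```python
# Toy illustration: two runs whose only difference is a parameter change in the
# eighth decimal place end up far apart. The logistic map stands in for "the
# model"; nothing here comes from an actual GCM.
def run(r, steps=200, x=0.4):
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

run_a = run(3.9 * 1.00000005)
run_b = run(3.9 * 1.00000006)
print("run A ends at:", run_a)
print("run B ends at:", run_b)
# Both parameter choices look equally 'reasonable', yet the end states differ
# substantially -- and there is no independent observation of the far future
# against which to decide which run, if either, is the better one.
```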
"a very low signal to drivel ratio"
Very apposite!
Casually, the following line begins a new paragraph in the Wikipedia entry on global climate models.
This apparently simple sentence contains so many hidden assumptions, and simplifies what a computer climate model is to the point of falsification.
To start with,
Computer climate model variability in output, or computer climate model 'skill', is not equal to 'climate prediction uncertainty'.
Moreover, the Wikipedia entry includes no references whatsoever to criticism of, or problems associated with, a model-centric approach, or climate models specifically. There are a few mumblings about output uncertainty and such, but no citations of systematic examinations of model-based climate thinking.
Such citations can be found in this document, for example: (posted this at Bart's blog some time back).
How does the theologizing of physics contribute to global warming?
Whilst I respect anyone's right to pursue any analysis in any sphere, the persistence of an analysis that would have been considered fashionable in a 1960s sociology department into a 21st century environmental science department must be anomalous. He appears to do sound science; it's a shame he talks gibberish.
It's not altogether clear that his strange sociological musings haven't influenced other members of his department to their disadvantage.
'Next to our romanticized conceptions of how fundamental physics works, climate science looks like cheap hucksterism, and this is the source of many of the skeptical concerns about global climate models.'
A quote from the paper just linked to by Shub. It is one sentence from a 37-page apologium (apologia?) on climate modelling which I can readily agree with. I'll need to study the rest more carefully to see if there are any more. The author, by the way, has stated the above quote only in order to refute it.
As Founding Director of the Tyndall Centre, Hulme has been responsible for many of the computer modelling scares. He now comes across as Mr Reasonable, the Sceptics' friend. For a compendium of Tyndall press releases from while he was in charge at Tyndall, see here, starting with this in 2000:
Dr Mike Hulme, the Centre’s Executive Director, said: "Society is at last waking up to climate change. What might once have been considered unusual weather conditions for the UK – the recent storms and flooding, for example – are likely to be much more frequent occurrences.
http://www.scribd.com/doc/26738398/Tyndall-Press-Statements
For more on how Tyndall works see here:
http://scienceandpublicpolicy.org/reprint/social_construction.html
And of course there is his famous line in his book, "Ask not what you can do for climate change, ask what climate change can do for you"
Trenberth actually said:
"So here is my prediction: the uncertainty in AR5's climate predictions and projections will be much greater than in previous IPCC reports...But while our knowledge of certain factors does increase, so does our understanding of factors we previously did not account for or even recognize."
This means the IPCC has much less confidence in its latest climate model predictions than in the previous report's. The climate is a chaotic system; modeling any chaotic system is next to impossible, but modeling our vast, incredibly complex, incompletely understood, inadequately measured climate system is way, way beyond our current capabilities. Anyone who has faith that the current GCMs can accurately predict our future climate is either a propagandist or does not know what they are talking about.
DrCrinum wrote:
Which is not in the least surprising considering the following excerpts from his Why we disagree about climate change [not that I've read the book, but Richard Lindzen highlighted these in one of his presentations]:
Propping up very tarnished “gold standard” of IPCC’s “plastic” climate change
Perhaps these were "taken out of context" but somehow I doubt it. And I, for one, cannot imagine any context which would make the words less damning (not to mention less unscientific!)
It's also worth noting that it was Hulme who shrunk the "consensus" he was so instrumental in building back in the pre-Kyoto days.
ooops ... sorry, it was cosmic's comment, not DrCrinum's to which I was responding. Although I prefer to think that neither would have a problem with my follow-up observations :-)
Hi, could John Shade drop me an email? I need to give him an apology (I mixed him up with someone else, another JS) and would love to have a chat....
I have just 'played' an online UK geography schools Key Stage 4 'game' entitled
Operation Climate Control - Funded by DEFRA...
In my mind it is WORSE than 10:10 No Pressure... polite words are currently failing me.....
I will put something detailed together later...
a snippet...
You get hit in the face with a cartoon sea mine (which explodes in a cartoon manner) every time you get a survey-style question 'wrong'.
a question:
Q: PLANTS CAN'T ABSORB ENOUGH Carbon Dioxide
"Plants absorb some of the CO2 in the atmosphere, but as we put more into the atmosphere, and chop down more forests, they can't absorb as much, which makes climate change worse"
Choose How To Deal With This Climate Change Impact
A1: Don't plant more trees, focus on reducing CO2 emissions
A2: Plant more trees to absorb carbon
A3: Encourage the timber industry by building more buildings from wood
Their correct answer is A1, reduce emissions; anything else gets an exploding (cartoon) sea mine (spiky World War 2 sort) in the face.
A cute clichéd 'cartoon youth' character called me a selfish meany at one point.
The game is class-focused, i.e. registered and the whole class signs in, so everyone can see the 'scores'.
no pressure...
Aims Key Objectives
Operation: Climate Control is designed for Key Stage 4 students learning about climate change for their GCSEs. The game isn't going to teach about the scientific link between carbon dioxide emissions and climate change, that is assumed. The game concentrates on the impacts of climate change, and what can be done about it.
Play the current Key Stage 4 Geography: funded by DEFRA game yourself.
It is dated 2007, so three years old. I wonder how many Key Stage 4 pupils have played this....
Operation Climate Control.
http://www.operationclimatecontrol.co.uk/new
Someone got their MSc writing about the effect of this company's PREVIOUS climate change game..
"The game has been developed by Red Redemption Ltd., a leading environmental games company, who have previously developed Climate Challenge a similar game aimed at young professionals for BBC Science and Nature."
Let's repeat that again..
BBC SCIENCE AND NATURE
The MSc Thesis:
MSc Dissertation: Environmental Change and Management
Environmental Change Institute, University of Oxford
http://www.economics.ox.ac.uk/members/cameron.hepburn/Rowlands(2006).pdf
"Communication of Climate Change
How is the climate change message put across to the general public?
Climate change is a global problem that should be addressed at every level of society, from the household to business to government (Defra, 2006a). This means that explaining the issues surrounding climate change to the general public is of vital importance, if everyone is to take measures to reduce carbon dioxide emissions.
Governments have a role to play in this, and have set up public awareness campaigns, such as the UK Department for Environment, Food and Rural Affairs (Defra) campaign Climate Challenge, which has the aim to “to educate, excite and inspire” people about climate change (Climate Challenge, 2006).
Indeed, a future version of the Climate Challenge game aimed at school children at key stages 3 and 4 will be funded by the Climate Challenge fund, associated with the campaign."
See the appendix (pg 75) on results of a survey from the game....
4. Of those gases, which do you think, overall, contributes the most towards climate change?
Argon 0%
Nitrous Oxide 0%
Carbon Dioxide 77%
Oxygen 1%
Chlorofluorocarbons (CFCs) 4%
Ozone 4%
Helium 0%
Perfluoromethane 0%
Hydrofluorocarbons (HFCs) 1%
Water vapour 5%
Methane 6%
Don't know 2%
Nitrogen 0%