
Predicting climate 100 years from now

These are notes of a lecture given by Prof Tim Palmer on some of the fundamentals of weather prediction. The notes were taken by Simon Anthony. This is well worth a read, and I'm certainly struck by how little we know about how to forecast the climate.

If we can't forecast next month's weather, what hope for predicting climate 100 years from now?

Lecture at Dept of Earth Sciences, University of Oxford by Professor Tim Palmer, Royal Society Professor at Oxford, previously at European Centre for Medium Range Weather Forecasts.

[In contrast to simplistic fixed view of climate change preferred by journalists and politicians, TP adopts more traditional scientific view: create and develop models, make predictions, compare predictions with actual measurements, revise/replace models, try to understand models’ limitations. He seems happy to talk about uncertainties.  That said, he did sign the Met Office “Statement from the UK Science Community”… .  Taken together with his final suggestion of the need for a “CERN for Climate”, I’d say he seems like a good scientist who believes in the importance of his science, trying to argue the best case for that science but not necessarily too concerned about “collateral damage”.]

Why ask this question?

Following Climategate, Glaciergate and the repeated failure of Met Office’s seasonal forecasts, this is a question the public and commentators often ask rhetorically to argue that long-term climate predictions must be nothing more than guesswork.

An answer for the public

The failure of short-term prediction doesn't necessarily mean long-term forecasts won’t work, but you need to be clear about what’s being predicted on different time-scales.

An illustration is the Lorenz model: a simplified weather model which showed "sensitive dependence on initial conditions" (aka the "butterfly effect"): two initially very close states may diverge to very different later states.

Lorenz's model “flips” between two different states; when it flips or how long it stays in one state before going to the other is unpredictable. However, the probability of being in either state over a long period is entirely predictable.
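Both halves of that claim can be sketched with a minimal simulation of the Lorenz '63 system (my own illustrative toy, not anything Palmer showed; the step size and run length are arbitrary choices): two trajectories that start almost identically end up wildly different, yet the long-run fraction of time each spends in one "wing" of the attractor is statistically stable.

```python
# Minimal sketch of the Lorenz '63 system (sigma=10, rho=28, beta=8/3),
# integrated with a small-step Euler scheme. An illustrative toy only.

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def trajectory(state, n_steps):
    path = [state]
    for _ in range(n_steps):
        state = lorenz_step(state)
        path.append(state)
    return path

a = trajectory((1.0, 1.0, 1.0), 100_000)           # 100 time units
b = trajectory((1.0 + 1e-8, 1.0, 1.0), 100_000)    # nudged by 1e-8

# Butterfly effect: the pointwise gap between the runs becomes large...
max_gap = max(abs(p[0] - q[0]) for p, q in zip(a, b))

# ...but the climate-like statistic (fraction of time spent in the
# x > 0 "wing") is similar for both runs.
frac_a = sum(1 for s in a if s[0] > 0) / len(a)
frac_b = sum(1 for s in b if s[0] > 0) / len(b)
print(max_gap, frac_a, frac_b)
```

The individual forecasts diverge completely, while the occupancy statistics barely move: weather unpredictable, "climate" predictable.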

Warm and cold winters and greenhouse gases

The two states might be interpreted as "cold" and "warm" winters; while it isn't possible to predict whether a particular winter will be cold or warm, the proportion of each type of winter can be predicted.

When an extra "forcing" term is added to the Lorenz model, the system's flips are still unpredictable but the relative probabilities of the two states change in a predictable way. The forcing might be interpreted as the effect of greenhouse gases being added. You can’t say for sure whether this coming winter will be warm or cold but the model predicts that warm winters will be increasingly probable.
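A toy version of that forcing experiment can be sketched in the same spirit (the particular form of the forcing – a constant added to the x and y equations – and the value f=10 are my illustrative assumptions, not Palmer's actual setup):

```python
# Toy forcing experiment on the Lorenz '63 system: a constant f is
# added to the dx/dt and dy/dt equations. Individual regime flips
# stay unpredictable, but the long-run fraction of time spent in the
# x > 0 ("warm") regime shifts when f is switched on.

def forced_step(state, f=0.0, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * (sigma * (y - x) + f),
            y + dt * (x * (rho - z) - y + f),
            z + dt * (x * y - beta * z))

def regime_fraction(f, n_steps=200_000, burn_in=10_000):
    """Fraction of (post-transient) time spent in the x > 0 regime."""
    state = (1.0, 1.0, 1.0)
    hits = total = 0
    for i in range(n_steps):
        state = forced_step(state, f)
        if i >= burn_in:
            total += 1
            hits += state[0] > 0
    return hits / total

print(regime_fraction(0.0))   # roughly even occupancy of the two wings
print(regime_fraction(10.0))  # occupancy biased toward the x > 0 wing
```

The unforced run splits its time roughly evenly between the wings; with the forcing on, the "warm" wing is occupied noticeably more often, even though no individual flip can be forecast.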

TP doesn’t mean that this is a realistic model. It’s only used to make the obvious point that even though you can’t predict a single throw of a die, the frequency with which a particular number comes up over many throws is nonetheless predictable. Weather is a particular event, while climate is the probability of particular events occurring.

Hurricane Fish and the butterfly

The butterfly effect in real weather is exemplified by Michael Fish’s 1987 misfortune. At that time the Met Office made one prediction, based on measured data, for any given time-scale. On October 15th, 1987, that single forecast, on which MF relied, failed to predict the UK hurricane.

Nowadays people do ensemble forecasting – they use not just the actual measurements but a number of similar but slightly altered values – and sometimes also average over different weather models. All these forecasts are run and the probability of various future weathers is assessed. Ensemble forecasts made using the data available to MF predict, with significant probability, hurricanes in SE UK, as well as lots of other results. The initial conditions in October 1987 were unstable, with a range of very different local predictions.

Forecast probabilities

If the public better understood probability, the Met Office could give the relative probability of different forecasts. Such predictions can be quite good. For example, ECMWF predicts the probability of precipitation throughout Europe, then looks at places for which the probability is, say, 60% to see whether the fraction which actually experience precipitation is 60%. Predictions aren’t perfect but work well (to within ~2-3%) across the whole range of probabilities.
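That verification procedure – binning forecast probabilities and checking each bin's stated chance against the observed frequency – can be sketched with synthetic data (a made-up, perfectly calibrated toy forecaster, not ECMWF output):

```python
import random

# Reliability check on synthetic forecasts: draw a forecast probability
# p for each case, let the event occur with that same probability (a
# perfectly calibrated toy forecaster), then compare each probability
# bin's stated chance with the frequency actually observed.

random.seed(0)
n_cases, n_bins = 100_000, 10
hits = [0] * n_bins
totals = [0] * n_bins

for _ in range(n_cases):
    p = random.random()                  # forecast probability for this case
    occurred = random.random() < p       # whether the event happened
    b = min(int(p * n_bins), n_bins - 1) # which probability bin p falls in
    totals[b] += 1
    hits[b] += occurred

for b in range(n_bins):
    centre = (b + 0.5) / n_bins
    observed = hits[b] / totals[b]
    print(f"forecast ~{centre:.2f}  observed {observed:.3f}")
```

With enough cases each bin's observed frequency sits within a couple of percent of the forecast probability, which is the ~2-3% agreement the lecture attributes to real precipitation forecasts.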

So the straightforward answer is that short-term failures of weather prediction are due to the perceived requirement of giving a single forecast rather than a range of forecasts with different probabilities.  If the probabilities can be accurately predicted then very long-range climate forecasts may still be possible.

Weather influences climate

But that was the easy message: the real relationship between weather and climate is subtler and more complex because initially small-scale weather effects may become amplified to affect climate. It’s not obvious beforehand when such complications will occur.

For example, using the same models that work well at predicting short-term weather probabilities, you can make regional seasonal forecasts and assess them in a similar way.  TP has found that predictions are OK for, for example, the Amazon and Central America, but completely wrong for Northern Europe – no better than chance. It seems that climate models may have systematic biases; for example, persistent blocking anti-cyclones (as the UK has experienced for the past few months) are typically under-represented in climate models.

“Robust” prediction goes wrong

Should the reliability of ensemble forecasts matter for regional climate predictions? One school of thought holds that since the lifetime of a blocking anti-cyclone is very brief compared to, say, 100 years, such events don't matter.

TP thinks this may be simplistic.  For example, IPCC AR4 described a supposedly robust signal for future warmer and drier European summers, typified by that of 2003. But these predictions used a model with a grid spacing of ~160km. When the calculations are repeated on a grid spacing of ~20km, the signal is much weaker and fragmentary. It seems as though the difference is due to the higher frequency of blocking anti-cyclones at the finer resolution.

The point is that climate models running on computers can’t include features smaller than their grid size but the actual climate may amplify the importance of such features so that they make a significant difference to climate predictions.

Possible fundamental problems

How much resolution is needed to capture climate change details? For example, convective instabilities (~km scale) aren't included in climate models; should they be? Does higher resolution reduce uncertainty? There’s no good theory for estimating how well climate simulations converge with increasing resolution.   Even worse, the equations themselves change with finer resolution as new features have to be included.

The underlying unknown is whether there is an irreducible level of uncertainty in the climate equations.

How do you test predictions of climate 100 years hence?

Obviously you can’t use the traditional method of comparison with real measurements.  You can only make and test relatively short-term predictions.  When these are accurate, with known limitations and the same models are used for 100-year prediction, then you may have confidence in the longer-term predictions.

We need bigger computers

All of which invites the question: does climate science have enough computing power to establish its own limitations?   Perhaps there’s a need for a “CERN for climate science”, something apparently to be proposed here…  …by a Dr Robert Bishop, president of something called the “International Centre for Earth Simulation Foundation”.  [I haven’t been able to find out any more about the ICESF so it may currently be only a glint in Dr Bishop’s eye.]




Reader Comments (50)

It would be much more interesting just to take a peep at Schrödinger's pussy...

Mar 10, 2010 at 2:41 PM | Unregistered CommenterKeith in Ireland

"Obviously you can’t use the traditional method of comparison with real measurements. You can only make and test relatively short-term predictions. When these are accurate, with known limitations and the same models are used for 100-year prediction, then you may have confidence in the longer-term predictions."

Non sequitur.

Mar 10, 2010 at 2:48 PM | Unregistered CommenterFrederick Davies

I prefer the old "indian" weather rock. It never fails. If it's wet, it's raining; if it's white, it's snowing; if it's rocking, it's windy; if I can't see it very well, it's foggy...

Mar 10, 2010 at 3:00 PM | Unregistered CommenterKevin

Following on from finally having agreed to stop making weather forecasts, the widely respected Met. Office has just issued a new statement:

The Met. Office regrets that they may have been a bit hasty in promising to make-up a new temperature record for the last 150 years within a period of about three years.

The Met. Office has recently sought advice from two computer/software experts in New York, who said that the Met. Office's current fleet of 100,000 Sinclair ZX Spectrums is not up to the job of fiddling with hundreds and hundreds of perfectly good handwritten temperature records to produce whatever conclusion the Chairman of IPCC v. 2.0 wants this week, particularly as it might change the week after next.

They suggested that the Met. Office buys from NASA several thousand redundant 850 MB, 75 MHz computers, which it was emphasised, had done a reasonable job on the 1999 Mars Climate Orbiter project, notwithstanding the metric/imperial units cock-up, at the bargain price of 1,000 dollars per unit.

The Met. Office confirms that they have agreed to buy 50,000 machines from Jim and his mate Gavin. Gavin, who is a very nice man, will provide technical support from his own personal website. The Met. Office points out that, because the machines have been up-graded, free of charge, to the latest version of Windows 95 they will be immune from hacking because nobody is writing viruses now for this particular operating system, consequently whatever is said in e-mails will be secure and nobody will ever find out.

It will be necessary to reinforce the electrical supplies to the site from 2MW to 5GW and a further area of land the size of three football pitches will be required for the extensions. As soon as the Chairman of IPCC v. 2.0 announces what he wants us to show incontrovertibly, the Met. Office will get stuck in and will provide progress up-dates at three yearly intervals, which is of course what was meant in the first place, not to finish in three years, silly.

Mar 10, 2010 at 3:01 PM | Unregistered CommenterBrownedoff

What i find confusing about this debate over forecasting is this:

Even alarmists acknowledge that you can't satisfactorily forecast the weather beyond a day or two. Hence they claim they are forecasting climate.

Then how come we have so much alarmist research based on weather events (increased intensity of tropical storms, more snow, less snow, more rain, less rain, more floods, etc.)?

Mar 10, 2010 at 3:07 PM | Unregistered CommenterGeckko

"We need bigger computers"
The usual disclaimer climate scientists utter in all their publications. Yet the GIGO law springs to mind. What use are more multimillion-dollar teraflop super-Crays when you don't know what to put in?
Until now, all the funds spent on new computers have only resulted in the tolerance shrinking from 0.3 degrees to 0.2 degrees.

Mar 10, 2010 at 3:40 PM | Unregistered CommenterHoi Polloi

As a former computer scientist:

Modelling complex, chaotic, non-linear systems in computers AND expecting to be able to use them to extrapolate and make real world predictions....

Is impossible...

However big the computer is, however many people, however many millions/billions you throw at it, it is impossible.

(Look at what happened when this was tried for modelling risk – and that actually allowed real feedback into the real world, bought and sold on predictions.)

This was a 'supposedly' much simpler non-linear, complex system to measure than climate.
It had all the 'genuine' rocket scientists, from the cream of universities, tempted by vast salaries into the city. MUCH more money spent.

But again a trend was followed, the models were developed to more closely match the trend...
The models converged as the trend kept going.

See where the analogy is going!?

Then the trend changed (credit crunch)

Mar 10, 2010 at 3:52 PM | Unregistered Commenterbarry woods

As an ex-academic physicist, I am saddened by the desire of these guys to do simulations all the time. Building a long-term model of the global atmospheric system requires a solid understanding of the relationship between CO2, biomass, the seas, feedback effects, and much much more. Just because you can solve your equations at a given surface resolution does not mean that everything else is correct. A model which ignores some unknown factor will be useless even if the equations are solved to a 1mm grid spacing!!

More physicists need to get away from their computer screens and design experiments which can be used to test and measure the climate. CERN is an experiment - not a simulator. The scientists at CERN do not know whether the Higgs boson will be found. There is a theory to be tested. If the theory is wrong then it is wrong.

The same is true for climate. We do not need more simulations. We need experiments. We need scientists to get out of their offices and get their hands dirty. I am not a climatologist, but there must be things that can be done. There seems to be a lack of imagination here with everyone running to their computers.

Mar 10, 2010 at 3:53 PM | Unregistered CommenterDominic

Thanks for posting this! The computational challenge reminded me of the "Anthrax Project" which networked 3.5 million personal computers running "screensaver" software in their spare time. I've jotted down an outline for a similar system for climate modeling on my blog here:

How to Get Enough Computing Power for Climate Modeling

Mar 10, 2010 at 3:56 PM | Unregistered CommenterScott W. Somerville

Oh, yes, need much bigger computers, and international collaboration because no single country could afford to build it...nice toys if you can get them.

The Met Office's UKCP09 report claims to be able to do projections/predictions out to the end of the century within a 25km grid. This was not supposed to be anywhere near possible - Julia Slingo, President of the Royal Meteorological Society, said so repeatedly - before she was offered the job of Chief Scientist at the Met Office of course: now she is the spokesman for government propaganda, and projections that far out are a cinch. We can make up anything to order, so long as you promise to give us bigger and more expensive playthings.

But even before the Met had their new power-hungry number cruncher, Dr Vicky Pope was singing the praises of the Met Office, which she said was routinely doing projections 1000 years hence.

Of course, if the projections don't look to be on track in 40 years time, the answer would be (1), forget it, we've got better models and computers now, and (2) that's too short a timescale: come back in a couple of hundred years and we'll see how the projection is shaping up then.

Either way, I should like to know why taxpayers money is being squandered to produce projections of climate for 1000 years hence. Clearly the Met has more interest in playing computer games in virtual reality than in doing anything useful.

See here:

Mar 10, 2010 at 4:00 PM | Unregistered CommenterScientistForTruth

Maybe if they knew what they would do with more computer power they could say how much computer power they need.

It just looks like an excuse from here.

Mar 10, 2010 at 4:01 PM | Unregistered CommenterJack Hughes

Perhaps there’s a need for a “CERN for climate science”,

I think there is, but I'm not talking about billion dollar projects. Climate science has been using trees and various isotopes as proxies for climate for decades. More than long enough to set up actual experiments to measure both the accuracy and influencing factors. It appears to me that the only experimentation climate scientists are prepared to invest in is with computer models. Its about time they went back to basics and actually started making measurements in the field.

Mar 10, 2010 at 4:19 PM | Unregistered CommenterTerryS

Thanks Brownedoff for your account of the MetOff's computerized climate science. Assuredly so far the most accurate (and hilarious).

About Tim Palmer's craving for supercomputers and his fallacious pretexts to justify more power, here is what Henk Tennekes, an insider to this industry, said after a correspondence with him:
“So you’re really lobbying for a massive computer facility”, I wrote, “you participate in the same song and dance that has annoyed me for so long”. In my years as Director of Research at KNMI, the scientists around me honestly felt that my only job was to promote the early purchase of the next supercomputer. They were eager to collude behind my back with the hardware crowd at KNMI and salesmen from computer manufacturers. This often resulted in seemingly attractive discounts being offered around October, just when the salesmen had heard through the grapevine that a budget surplus would soon be reported to the Management Team.

Given Tennekes' testimony (3 years ago!), Palmer is just a fence-sitting opportunist trying to preserve his perks, like Judith Curry. The remarks made by Willis Eschenbach to Curry apply perfectly to Palmer.

Mar 10, 2010 at 4:26 PM | Unregistered CommenterJean Demesure

Exponential error in recursive models is like inflation eating away the purchasing power of a currency.

Over time the level of signal to noise drops and feedback effects prevail.

It's not just GIGO, it's NPIGO (Near-Perfect In, Garbage Out).

The further forward in time predicted, the more recursion happens and the more the error exponentially grows.

Mar 10, 2010 at 4:28 PM | Unregistered CommenterAC1

I don't think its lack of imagination Dominic. The easy and big growth funding in tax payer backed science these days is for climate change "research". How much of that goes into anything other than playing computer games or advocacy dressed up as science is, I suspect, very small.

Why? Maybe its the calibre of the climatologists, and their preference for the best funding $ for the least actual work. Or maybe its because the controllers of the tax funds aren't actually interested in any better answers, but just want scary predictions to drive political and economic agendas.

A satellite based experiment has been proposed by Freeman Dyson to measure the net radiative balance of the atmosphere. This would provide an empirical test of AGW theory.

As SFT has pointed out above, fantasy climate projections 100 and 1000 years out are really of no value to us today. And yet their cost is very real.

In a world of finite resources, this means their true value is actually strongly negative: what else could be funded today instead, that could make life on this planet better for us and our children, now and into the future?

Mar 10, 2010 at 4:39 PM | Unregistered CommenterDrew

Is this the best argument they’ve got for being able to predict the climate?

A counter example would be someone in London who measures temperatures from January to May and predicts that by December everyone will be dead of heat stroke. By July the theory looks almost a certainty. It still doesn’t make the prediction right.

You can only create accurate trends once you know all the variables and week after week the climate community prove they’ve got a very poor grasp on the starting conditions, let alone the variables. The planet has seasons that last much longer than 12 months and they’ve not identified more than a fraction.

How can they predict Earth’s climate without even being able to predict something as simple as the sun’s sunspot cycle? Even if the response is very small, it’s a response (eg the effects on the flow of the Parana river) and those tiny effects add up. Let’s not even go into clouds, ocean currents, solar wind, etc. What about effects from new variables like a regreening of the Sahara or an accurate picture of how CFCs have affected the climate?

Then there’s the instrument temperature record. If you’re trying to recreate the effects of a climate variable to an actual response (eg SO2 from volcanic eruptions), how accurate do the temperature records need to be when you’re trying to pick up influences of a fraction of a degree? How accurately do you need to measure the influences themselves? ie does one scientist’s guestimate for sulphur released become a climate scientists vital input?

They should be able to accurately reproduce the climate for the last few hundred thousand years more easily than the climate for the next ten. I don’t see them issuing hindcasts for geologists and glaciologists to compare their findings with. Have they sent Mr Mann an email – sorry mate, you were wrong about the MWP?

The public do understand probabilities better than this guy thinks, they know he’s talking >95% bo££*(&s.

I’m sorry, computer models are not reality, not matter how realistic they seem. If they were then World of Warcraft would be a documentary.

Mar 10, 2010 at 4:49 PM | Unregistered CommenterTinyCO2

Jean, thank you for that reference to Tennekes - a piece I've admired for at least three years but that I'd forgotten referred to Palmer. I think it's important to note what he says at that point:

Is this [a gigantic supercomputer] what John Houghton, Bert Bolin, Martin Rees, the IPCC staff and the like are aiming for? I have parted the company of these power brokers many years ago, so I cannot begin to imagine what they are up to this time. Palmer has convinced me he is not their puppet, fortunately.

Not a great diplomat, old Tennekes - one of his greatest strengths! This would be broadly in line though with Simon Anthony's verdict from the lecture. Not their puppet. It's a corrupted field within which he works though - I've been convinced that Lindzen is dead right about that. So how much of this to trust? I found the example of the Lorenz model helpful. But of course it doesn't remotely prove that the real climate behaves anything like something as simple (yet chaotic) as a Lorenz model. Only experiment can begin to indicate that. Quite probably over a substantial period.

How much more supercomputing is needed by these guys is hard to judge. Perhaps such technical gizmos should begin to be rationed according to the amount of genuine physics - hypotheses and measurement - being done in the field?

Mar 10, 2010 at 5:03 PM | Unregistered CommenterRichard Drake

CERN already has the CLOUD experiment underway

It's looking at a branch of climate science ignored for funding as it does not set out to prove CO2 as the prime climate forcing. Independent funding had to be found, which is a scandal in itself. As the IPCC models virtually ignore/can't handle the Sun/cloud generation scenario it is not surprising that they go awry in a very short timescale.

Do not buy any more computers - they will only take you to an incorrect conclusion more quickly.

As Prof Brian Cox put it in his BBC Universe series recently - "we know little about how the sun influences our climate" - it's about time we did before we make any more stupid plans.

Mar 10, 2010 at 5:15 PM | Unregistered Commenterjazznick

"We need bigger computers". Alternatively, they might want to look at how other forecasters work, especially if these others already get good results without the sort of Deep Thought-like super computer the Met Office appear to be certain they need. Accuweather, maybe? I find that Joe Bastardi talks plain common sense, and his European blog is always worth a glance:

Mar 10, 2010 at 5:28 PM | Unregistered CommenterAlex Cull

It's OK your grace. Lord Rees of the "British Society" (I think the BBC meant Royal Society) was interviewed on the BBC Radio 4 Today programme this morning

Apparently, we're all being too sceptical and the press is getting too excited playing up one or two careless incidents at the IPCC (whose science core is absolutely sound) and so on. So what was the whole kerfuffle about? The panic is officially over and climate science can go back to business as usual: not science, of course - God forfend - but certainly business!

BTW, I've never seen (not that it hasn't happened before without my knowledge) an "extended" version of a broadcast interview being posted to the BBC Today website. I've certainly never seen an extended version of a sceptic's opinions nor, indeed, of anyone's opinions if those opinions differ from the "settled" opinions of the BBC on matters like immigration, jihadist terrorism, crime statistics, smoking, alcohol abuse, education, Obama, Israel etc etc.

Mar 10, 2010 at 5:52 PM | Unregistered CommenterUmbongo

Good luck with throwing giga-flops at the problem, boys. Theory says it's inherently unpredictable, but don't let a little quibble like that stop your funding drive...

Mar 10, 2010 at 5:59 PM | Unregistered Commentermojo

Brownedoff said ....

Brilliant, Mr B.

Mar 10, 2010 at 6:15 PM | Unregistered Commenterdoobie

breaking news on the IPCC "independent" review

Mar 10, 2010 at 6:50 PM | Unregistered CommenterEdBhoy

As Edward Lorenz proved in 1964, complex dynamic systems --those with three or more interacting variables-- are inherently unpredictable, "non-random but indeterminate", due to sensitive dependence on initial conditions (the Butterfly Effect). Lorenz's Chaos Theory combined with Benoit Mandelbrot's Fractal Geometry (1974) render linear extrapolations from atmospheric systems [Lorenz was a meteorologist] impossible both in mathematical principle and real-world physical practice. Though cyclical phenomena enable one to gauge potential outcomes by assessing peaks and troughs, abrupt transitions inevitably separate regimes. There is in fact no way of predicting even relatively near-term circumstances, any more than Newton's "three-body problem" makes gravitational dispositions stable over time.

At this late date, so-called climatologists had better recognize that their vaunted "models" (sic) are intrinsically, irremediably, incapable of making any valid projections whatsoever. "Climate studies" is not an empirical, experimental discipline, but a classification exercise akin to botany-- interesting in hindsight but otherwise utterly obscure. Biological mutations occurring randomly are a precise analogy to climatological exigencies-- as Lorenz himself famously said, over geological time-spans Earth truly does not have a climate. (Think "snowball Earth" vs. 100-million year post-Cambrian tropical venues, due primarily to plate tectonic shifts rather than to any astronomical or atmospheric perturbations.)

Once one accepts that Climate Cultists' deviously plotted outputs are prima facie meaningless, Warmists' resort to chicanery and fraud is readily apparent: Frankly, junk-science manipulative tools are all they have. As Ehrlich, Hansen, and their ilk descend to agitprop on behalf of their irrefutably discredited hypotheses, the game is up. Increasingly shrill bleats and squeaks expose them ever more for what they are-- death-eating Luddite sociopaths bent on sabotaging, subverting, global energy economies at any cost. Beyond partisan labels, Green Gangsters' blighted attitudes evince deep-seated psychological debilities intent on dragging all humanity with them to Abyss.

Mar 10, 2010 at 6:51 PM | Unregistered CommenterJohn Blake

What is the point of suggesting the Met Office publish forecasts with probabilities? Take the winter - it will either be colder than average, average or warmer than average. If each outcome is just assigned a probability as a %, the predictor is never wrong. In fact they are not making a prediction at all. Whatever happens, they will have 'predicted' it.

I will offer such a service, at a fraction of the cost of the Met Office. I will consult the runes, kill a few chickens and consult the entrails, and produce a few probabilities. I reckon I will be no less accurate than the Met and I'll only charge the taxpayer £100K per annum. Deal?

Mar 10, 2010 at 7:21 PM | Unregistered CommenterJim

"We need bigger computers"

Presumably Piers Corbyn at WeatherAction nabbed them first so the Met office has had to soldier on with their 100,000 Sinclair ZX Spectrums.

Larger computers will just produce more garbage or the same garbage quicker..

Mar 10, 2010 at 7:25 PM | Unregistered CommenterFranks

This is OT but merits discussion. In particular, has the horse already departed the stable or is there time to ensure a diverse group of reviewers and a sufficiently broad charge?

The following is a press release from the InterAcademy Council that was posted by Andrew Revkin on DotEarth and at WUWT. As with any of the reviews, it all depends upon the panel’s charge and its membership.

Date: March 10, 2010



AMSTERDAM, Netherlands -- The InterAcademy Council (IAC), a multinational organization of the world's science academies, has been requested to conduct an independent review of the Intergovernmental Panel on Climate Change (IPCC) processes and procedures. The study comes at the invitation of the United Nations secretary-general and the chair of the IPCC, and will help guide the processes and procedures of the IPCC's fifth report and future assessments of climate science.

The IAC has been asked to establish an ad hoc Independent Evaluation Group (IEG) of experts from relevant fields to conduct the review and to present recommendations on possible revisions of IPCC practices and procedures. In addition, the IEG is asked to recommend measures and actions to strengthen the IPCC's capacity to respond to future challenges and ensure the ongoing quality of its reports.

Founded in 2000, the IAC was created to mobilize top scientists and engineers around the world to provide evidence-based advice to international bodies such as the United Nations and World Bank -- including preparing expert, peer-reviewed studies upon request. The IAC Board is composed of the presidents of 15 academies of science and equivalent organizations -- representing Argentina, Australia, Brazil, China, France, Germany, India, Indonesia, Japan, South Africa, Turkey, the United Kingdom, and the United States, plus the African Academy of Sciences and the Academy of Sciences for the Developing World (TWAS) -- and representatives of the InterAcademy Panel (IAP) of scientific academies, the International Council of Academies of Engineering and Technological Sciences (CAETS), and the InterAcademy Medical Panel (IAMP) of medical academies. The IAC Secretariat is hosted by the Royal Netherlands Academy of Arts and Sciences (KNAW) in Amsterdam. The IAC Board has final approval authority over conducting and publishing IAC studies.

The IAC is currently led by two co-chairs, Robbert Dijkgraaf, president of the Royal Netherlands Academy of Arts and Sciences, and Lu Yongxiang, president of the Chinese Academy of Sciences. Following IAC board approval of the review, the IAC co-chairs will appoint members of the IEG after a vetting process to assure their expertise, balance of perspectives, and absence of conflicts of interest. They will be volunteers who serve PRO BONO; only their travel and meeting expenses will be paid. Participants in the IEG will not be under obligation to any government, the IPCC, or the United Nations. The IAC and IEG will receive financial support for their work from the United Nations. Because work on the Fifth Assessment of IPCC has already commenced, the IEG has been asked to deliver its findings by Aug. 31, 2010.

Robbert Dijkgraaf said he was pleased to be representing the world's scientists and science academies. "The InterAcademy Council," he said, "is prepared to take on the challenge of this important review of the work and processes of the Intergovernmental Panel on Climate Change. Our goal will be to assure nations around the world that they will receive sound, definitive scientific advice on which governments and citizens alike can make informed decisions."

Lu Yongxiang recalled that when the IPCC was created by the World Meteorological Organization and the United Nations Environment Programme in 1988, its charge was to provide scientific and comprehensive information about climate change. "With this review," he said, "the IAC will carefully examine the IPCC's procedures, processes, and types of products to ensure that climate change issues will be scientifically presented and solid science-based recommendations will be provided in future IPCC assessment reports."

"I welcome Secretary-General Ban Ki-moon's decision," said Ralph J. Cicerone, IAC board member and president of the U.S. National Academy of Sciences, "to recruit experts from the world's science community for this independent review of the IPCC, examining both its strengths and any areas where changes may be needed to produce the best possible assessments of climate science."

Lord Martin Rees, IAC board member and president of the Royal Society, said, "Climate science is inherently complex, integrating many different disciplines and kinds of data. The IPCC's role in assessing and expounding the latest scientific findings is getting ever more important. This independent review of its procedures is timely and important, as an aid to ensuring that future reports, which will assess new and updated research, are optimal resources for making sense of climate change and helping policymakers respond to it."


Amsterdam: John Campbell, IAC Executive Director
Kloveniersburgwal 29
1011 JV Amsterdam
The Netherlands

Contact: William Kearney, Director of Media Relations
U.S. National Academy of Sciences
Phone: +1 202 334 2138

Web site:

# # #

Mar 10, 2010 at 7:48 PM | Unregistered CommenterRayG

Perhaps we should submit a research grant application to determine the CO2 footprint and climate impact of the hardware production, billions of computing hours, related power consumption, climate modellers' cups of coffee, endless farting etc.

And for a profitable sideline, we can quickly knock up something for the advocacy grey literature on the dangerous environmental impact of the exponentially increasing anthropogenic emissions of discarded obsolete computer hardware in the race to model the 100 year forward climate.

Mar 10, 2010 at 7:57 PM | Unregistered CommenterDrew

Something I should have included... Prof Palmer illustrated the quality of the precipitation forecasts with a chart comparing predicted and actual probabilities. As I said: agreement was good. As I should have added: all the errors were in one direction - every predicted probability was slightly greater than the actual probability. The models overestimated the probability of precipitation one hundred percent of the time.

Prof Palmer was asked about it and replied that it was a systematic error. He didn't give the impression that he thought it was significant.

I was too busy taking notes to think, so I missed following up by asking (I'm sure you're there before me) whether the models might also have a systematic temperature bias. While doing the precipitation calculations, it must have been tempting to compare the computer models' probability of a temperature being higher than the long-run average vs the actual occurrences. I wonder whether they did.

Mar 10, 2010 at 8:48 PM | Unregistered CommenterSimon Anthony

As Edward Lorenz showed in 1963, complex dynamic systems -- those with three or more interacting variables -- can be inherently unpredictable,

And yet actuaries can tell you population life expectancy and travel agents can book plane loads of people on sun and skiing holidays, months in advance, year in year out.

Yep, that climate is predictable probably wouldn't even be controversial if some people didn't need to deny it for ideological reasons.

Or if, you know, people expected something about the world climate to change in an almost unprecedented manner for some reason.

Mar 10, 2010 at 8:55 PM | Unregistered CommenterFrank O'Dwyer

And they do it without supercomputers or a bill to the taxpayer, and when they get it wrong, they'll go bust.

There's a difference between putting your own money where your mouth is and offering a service for people to take up, compared to expecting tax payer funding, then demanding that all are compelled to take part in a social experiment of your design.

Mar 10, 2010 at 9:43 PM | Unregistered CommenterKeith

in what way is people going on holiday a good analogy with climate prediction?

Mar 10, 2010 at 9:58 PM | Unregistered CommenterSebastian Weetabix

apologies for the poor grammar. I have been enjoying a cheeky burgundy.

Mar 10, 2010 at 10:01 PM | Unregistered CommenterSebastian Weetabix


Thanks for your input to this thread.

The Prof claimed a good but exaggerated match for precipitation. I wonder if any of the other predictions (like temperature, wind, etc) matched at all - or was he just putting a brave face on failure?

It all looks like the "Texas sharpshooter" fallacy. The Prof has fired a gun and now he's painting target rings round the bullet hole.

We are heading into horoscope land....

Mar 10, 2010 at 10:36 PM | Unregistered CommenterJack Hughes

From Simon's report

IPCC AR4 described a supposedly robust signal for future warmer and drier European summers, typified by that of 2003. But these predictions used a model with a grid spacing of ~160km. When the calculations are repeated on a grid spacing of ~20km, the signal is much weaker and fragmentary. It seems as though the difference is due to the higher frequency of blocking anticyclones at the finer resolution.

I guess Prof Palmer was referring to something like this - prediction of more dry days in southern Europe (the legend indicates that at least 5 out of 9 models classed the change in that area as statistically significant). So if that significance goes away if the spatial resolution is increased - what other IPCC predictions will have to be revised?

Mar 10, 2010 at 11:14 PM | Unregistered CommenterDR


in what way is people going on holiday a good analogy with climate prediction?

Well it seems that many people want to go to sunny or snowy climes, and other people are making a living predicting when they should go, and the gps co-ordinates they should go to. It also seems that crowds of foreigners have been building airports and resorts in those locations decades before the people arrive there.

It's uncanny, isn't it, and all without the aid of a computer. Perhaps they are psychic?

Mar 11, 2010 at 7:33 AM | Unregistered CommenterFrank O'Dwyer

Fairly small number of variables there, Frank, on short timescales. Doesn't sound too complex to me. Perhaps it's a bit different from determining the global climate in 100 years after all. And belief in the arrival of tourists doesn't require them to tax me to death or return to the stone age courtesy of acme patent bird slicers in place of reliable base load coal, oil & gas-fired power stations.

The jig is up for the AGW scam mate.

Mar 11, 2010 at 8:20 AM | Unregistered CommenterSebastian Weetabix

Frank, you are in the camp lecturing skeptics for confusing weather and climate when they say that 50-year climate predictions are junk given the abysmal seasonal forecasts.
Yet you now compare 50-year climate predictions to (one-year) resort booking forecasts.
Why do I feel you're using flagrantly self-serving arguments?

Mar 11, 2010 at 8:25 AM | Unregistered CommenterJean Demesure

Frank - Actuaries can predict mortality because they have centuries of data and because they base their studies on populations of millions of people, so that the standard error of the rates they want to predict is tiny - the signal is much larger than the noise.

Because of this, the factors which drive mortality have been studied over decades and are well understood: smoking, genetic factors (e.g. heart disease), diet, quality of healthcare etc...

They do get it wrong since it is impossible to predict the future perfectly. For example, no-one can predict exactly what advances future medical discoveries will produce. However there is a good understanding of many of the other factors which are pretty static. So only some of the factors are non-static and these are the ones which cause the predictions to be faulty. As a result, the predictions are pretty good.

The main point is that human mortality is calculated at a population level and is not a non-linear chaotic system. There is however only one climate system and it is a non-linear chaotic system and we do not even know all of the drivers let alone understand them and their interactions with other drivers. The comparison is false.
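Dominic's signal-to-noise point can be made concrete. Treating deaths as independent Bernoulli trials, the standard error of an observed death rate p over n lives is sqrt(p(1-p)/n), which shrinks as the population grows. A minimal sketch (the 1% rate and the population sizes are purely illustrative, not actuarial figures):

```python
import math

def mortality_se(p, n):
    """Standard error of an observed death rate p estimated from n lives,
    treating each death as an independent Bernoulli trial."""
    return math.sqrt(p * (1.0 - p) / n)

p = 0.01  # an illustrative annual death rate of 1%
for n in (1_000, 1_000_000, 10_000_000):
    se = mortality_se(p, n)
    # signal/noise = how many standard errors wide the rate itself is
    print(f"n={n:>10,}: rate {p:.2%} +/- {se:.4%} (signal/noise ~ {p / se:.0f})")
```

At a million lives the noise is about 1% of the signal; at a thousand lives it is roughly a third of it. There is no comparable averaging over independent realisations available for the single climate system we have.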

Mar 11, 2010 at 8:50 AM | Unregistered CommenterDominic

There's an even more fundamental problem with Frank's analogy with life expectancy or holiday bookings: we know what these things are but we don't know what climate is.

Weather is something we know from experience. It's very complex as we look into it. One thing that caught my imagination recently is the thought that rain is 'Earthquakes in the Sky'. The study of weather is I'm sure going to spring some surprises in the next hundred years. But at least we know what we're studying.

Climate - no. There's an arbitrary figure of thirty years plucked out of the troposphere over which it is said weather becomes something else that is predictable. But there's no empirical evidence of that. So, having duly encouraged us with our ability to forecast something as well-defined as life expectancy in Cheam or the number of plane seats needed to Verbier next winter, Frank's next sentence:

Yep, that climate is predictable probably wouldn't even be controversial if some people didn't need to deny it for ideological reasons.

... well, that's extraordinary. What do you mean by climate? Saying 'climate is predictable' is like a line from Jabberwocky for me. Perhaps then it cannot be controversial. Yes, it may be impossible to deny, for ideological or any other reason. It's simply nonsense. Not even wrong, as Wolfgang Pauli once famously said.

Frank, you need to get specific. The average level of GATA (globally averaged temperature anomaly) over thirty years or whatever. You might be wanting to place a bet that this abstract (and obscure) statistic might turn out to be predictable. I'm not myself committing to the notion that nothing will ever be predictable at the 30, or 300, or 30,000 year level. We just don't know.

This I will say. Many of us will not live to see the day when we do have any idea of what stat over what time period is turning out to be predictable. I'm no actuary but you get that forecast for free.

Mar 11, 2010 at 12:30 PM | Unregistered CommenterRichard Drake

The world's climate cannot be put inside a box, the future cannot be predicted - even with a computer as big as the Earth. Read James Gleick's book Chaos; especially the part about the work of Edward Lorenz - way back in 1963! Are geologists the only scientists who know anything about history? Computer programmers should remember that the way to tell if a program is working properly is when it gives the results that are expected. Climate models are computer programs.
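For anyone who wants to see Lorenz's 1963 result for themselves, here is a minimal sketch of his three-variable system. It uses a simple Euler integration with an illustrative step size and perturbation (a serious study would use a higher-order integrator), and shows two nearly identical starting states diverging:

```python
# Lorenz (1963) system: sensitive dependence on initial conditions
# ("the butterfly effect"). Parameters sigma, rho, beta are Lorenz's
# classic values; dt, the step count, and the 1e-8 perturbation are
# illustrative choices.

def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system by one Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def trajectory(state, steps, dt=0.01):
    for _ in range(steps):
        state = lorenz_step(state, dt)
    return state

a = trajectory((1.0, 1.0, 1.0), 2000)          # reference run
b = trajectory((1.0, 1.0, 1.0 + 1e-8), 2000)   # perturbed by 1 part in 10^8

# The two runs, initially 1e-8 apart, separate by many orders of magnitude.
separation = max(abs(p - q) for p, q in zip(a, b))
print(separation)
```

Two runs that begin one part in a hundred million apart end up in clearly different states after a couple of thousand steps, which is the effect the head note describes.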

Mar 11, 2010 at 1:02 PM | Unregistered CommenterAngus - a geologist and programmer

But Angus, iirc, doesn't TV's preferred geologist Iain Stewart make endless prime time programmes telling us the science is settled? And that anyone who doubts it is mad or scientifically illiterate?

I do note that every time he wants to tell us this he does insist on flying 20 times around the planet with an enormous BBC TV crew in his wake. Perhaps he just wants to make doubly sure every bit of MMCO2 gets into play to give his favourite computer Super Models the best chance.

Your point about knowledge of history is an interesting one. From my own experience as one who long ago dabbled in mathematical physics, I think it might be quite rare that science in general is taught together with its history, which is a shame, as much can be learned about the nature of scientific progress from its developmental context.

Today's climatologists, their advocates and apologists for consensus science would do well to learn a bit more history, a bit more science, and the history of scientific consensus. Oh, and a lot more maths would help as well.

But of course, as we all know, it's really not about the science.

Mar 11, 2010 at 1:51 PM | Unregistered CommenterDrew

I think it is the English speaking scientists who have little grasp of the history of their sciences. The continental scientists often seem to have a much better grasp of the historical basis of their subjects.

Geologists have a comparatively good grasp of their science's history, due in part to its unifying theory having been found so recently, and in part to the great controversies in the science's past:
Neptunists vs Vulcanists, Catastrophism vs Uniformitarianism, the Scottish Highlands and Old Red Sandstone controversies...

Even the science's main originator, William Smith, received very poor and vindictive treatment at the hands of the Geological Society (at that time a "gentlemen's" dining club).

At one point, "Continental Drift" was down to, I think, a single high-profile supporter, Arthur Holmes, Prof of Geology at Durham, and so badly funded he supplemented his pay by running a rock shop in Newcastle.

Mar 11, 2010 at 3:05 PM | Unregistered CommenterKeith in Ireland

You may be correct Keith. But on the other hand, the AGW bandwagon is nothing if not international.

Personally I find the history of great ideas and the lives of their creators fascinating. The story behind the discovery brings each subject to life and often leads to insights when tackling new problems. And more often than not I've found the most able thinkers in any discipline also highly knowledgeable of their subject's history and its cast of characters.

Angus' original point is a good one, as there does appear to be a concerted movement to forget, either deliberately or through ignorance, the hard lessons from history in many walks of life: politics, economics and climatology immediately spring to my mind.

Mar 11, 2010 at 5:55 PM | Unregistered CommenterDrew


There's an even more fundamental problem with Frank's analogy with life expectancy or holiday bookings: we know what these things are but we don't know what climate is.

You don't? What's the climate of the Costa del Sol? If you were going there for a couple of weeks in August would you pack polar fleece or light clothing?

It's a toughie, isn't it. Feel free to use a calculator.

Mar 11, 2010 at 11:19 PM | Unregistered CommenterFrank O'Dwyer

Here is a link to basic info re: ICESF....

Mar 12, 2010 at 1:38 AM | Unregistered Commenterdave2

We don't need a super-supercomputer. I have worked out what the temperatures will be 100 years hence.

The temperatures will be the same as today, unless they will be different, in which case they will either be cooler or else they will be warmer.

I did these calculations on the back of me hand.

Mar 13, 2010 at 4:25 AM | Unregistered CommenterRichard


I agree with your opinion of greenhouse gases. Aside from water vapor, which has a residence time of about nine days, major greenhouse gases are well-mixed and take many years to leave the atmosphere. Although it is not easy to know with precision how long greenhouse gases take to leave the atmosphere, there are estimates for the principal ones.

Feb 23, 2011 at 12:40 PM | Unregistered CommenterCash for Platinum
