
Lewis on Schmidt on climate sensitivity

This is a guest posting by Nic Lewis. Nic has cross posted this to the comments at RC, with the normal style of response from Schmidt.

Gavin Schmidt

I am glad to see that my input into the Wall Street Journal op-ed pages has prompted a piece on climate sensitivity at RealClimate. I think that some comment on my energy balance based climate sensitivity estimate of 1.6–1.7°C (details in my article), which underpinned Matt Ridley's WSJ op-ed, would have been relevant and of interest.
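For readers unfamiliar with the energy-balance approach mentioned here, the standard calculation can be sketched in a few lines. The input numbers below are purely illustrative assumptions, not the actual data behind the 1.6–1.7°C figure:

```python
# Energy-budget estimate of equilibrium climate sensitivity (ECS):
#   ECS = F_2X * dT / (dF - dQ)
# where dT is the observed warming, dF the change in total forcing and
# dQ the change in ocean/system heat uptake. Numbers are illustrative only.

F_2X = 3.7       # W/m^2, canonical forcing from a doubling of CO2
d_temp = 0.75    # K, illustrative change in global surface temperature
d_forcing = 2.0  # W/m^2, illustrative change in total forcing
d_uptake = 0.35  # W/m^2, illustrative change in heat uptake

ecs = F_2X * d_temp / (d_forcing - d_uptake)
print(f"ECS estimate: {ecs:.2f} K")
```

With these made-up inputs the estimate happens to land near the quoted range; the real estimate depends on the observational datasets and forcing series used.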

You refer to the recent papers examining the transient constraint, and say "The most thorough is Aldrin et al (2012). … Aldrin et al produce a number of (explicitly Bayesian) estimates, their ‘main’ one with a range of 1.2°C to 3.5°C (mean 2.0°C) which assumes exactly zero indirect aerosol effects, and possibly a more realistic sensitivity test including a small Aerosol Indirect Effect of 1.2–4.8°C (mean 2.5°C)."

The mean is not a good central estimate for a parameter like climate sensitivity, which has a highly skewed distribution. The median or the mode (most likely value) provides a more appropriate estimate. The mode of Aldrin's main results for sensitivity lies between 1.5 and 1.6°C; the median is about halfway between the mode and the mean.
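The point about skewed distributions is easy to demonstrate numerically. A lognormal is used below purely as an illustrative stand-in for a right-skewed sensitivity PDF (not fitted to any actual study); its mode sits below its median, which sits below its mean:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative right-skewed "sensitivity" distribution: a lognormal
# with median 2.0. Not fitted to any climate study.
samples = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=1_000_000)

mean = samples.mean()
median = np.median(samples)
# Crude mode estimate: the peak of a histogram.
counts, edges = np.histogram(samples, bins=200, range=(0.0, 8.0))
mode = 0.5 * (edges[counts.argmax()] + edges[counts.argmax() + 1])

# For a right-skewed distribution: mode < median < mean.
print(f"mode {mode:.2f} < median {median:.2f} < mean {mean:.2f}")
```

For this toy distribution the mode is around 1.56 while the mean is around 2.27, a spread of the same character as the gap Lewis describes between Aldrin's mode and mean.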

I agree with you that Aldrin is the most thorough study, although its use of a uniform prior distribution for climate sensitivity will have pushed up the mean, mainly by making the upper tail of its estimate less well constrained than if an objective Bayesian method with a noninformative prior had been used.

It is not true that Aldrin assumes zero indirect aerosol effects. Table 1 and Figure 15 (2nd panel) of the Supplementary Material show that a wide prior extending from -0.3 to -1.8 W/m2 (corresponding to the AR4 estimated range) was used for indirect aerosol forcing. The (posterior) mean estimated by the study was circa -0.3 W/m2 for indirect aerosol forcing and -0.4 W/m2 for direct. The total of -0.7 W/m2 is the same as the best observational (satellite) total aerosol adjusted forcing estimate given in the leaked Second Order Draft of AR5 WG1, which includes cloud lifetime (2nd indirect) and other effects.

When Aldrin adds a fixed cloud lifetime effect of -0.25 W/m2 forcing on top of his variable-parameter direct and (1st) indirect aerosol forcing, the mode of the sensitivity PDF increases from 1.6°C to 1.8°C. The mean and the top of the range go up a lot (to 2.5°C and 4.8°C, as you say) because the tail of the distribution becomes much fatter - a reflection of the distorting effect of using a uniform prior for ECS. But, given the revised aerosol forcing estimates in the AR5 WG1 SOD, there is no justification at all for increasing the prior for aerosol indirect forcing by adding either -0.25 or -0.5 W/m2. On the contrary, it should be reduced, by adding something like +0.5 W/m2, to be consistent with the lower AR5 estimates.

It is rather surprising that adding cloud lifetime effect forcing makes any difference, insofar as Aldrin is estimating indirect and direct aerosol forcings as part of his Bayesian procedure. The reason is probably that the normal/lognormal priors he uses for direct and indirect aerosol forcing aren't wide enough for the posterior mean fully to reflect what the model-observational data comparison implies. When extra forcing of -0.25 or -0.5 W/m2 is added, his prior mean total aerosol forcing becomes very substantially more negative than -0.7 W/m2 (the posterior mean without the extra indirect forcing). That results in the data maximum likelihoods for direct and indirect aerosol forcing lying in the upper tails of the priors, biasing the aerosol forcing estimation towards more negative values (and hence biasing the ECS estimate upwards).
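The tail-fattening effect of a uniform prior can be illustrated with a toy calculation. The likelihood and numbers below are invented for illustration (they are not Aldrin's): because the likelihood for sensitivity flattens out at high S, a uniform prior leaves substantial posterior mass in the upper tail, inflating the mean and upper percentile while barely moving the mode:

```python
import numpy as np

# Toy setup: an observed net forcing f_obs with Gaussian error, and an
# implied sensitivity S with f(S) = F_2X * dT / S. All values are
# illustrative assumptions.
F_2X, d_temp = 3.7, 0.75
f_obs, f_sd = 1.65, 0.5

S = np.linspace(0.1, 20.0, 20000)
dS = S[1] - S[0]
lik = np.exp(-0.5 * ((F_2X * d_temp / S - f_obs) / f_sd) ** 2)

def summarize(prior):
    post = lik * prior
    post /= post.sum() * dS          # normalise to a density on the grid
    cdf = np.cumsum(post) * dS
    return {"mode": S[post.argmax()],
            "mean": (S * post).sum() * dS,
            "q95": S[np.searchsorted(cdf, 0.95)]}

uniform = summarize(np.ones_like(S))
recip = summarize(1.0 / S)           # a 1/S prior for comparison

# The mode barely moves, but the uniform prior fattens the upper tail,
# inflating the mean and the 95th percentile.
print("uniform prior:", uniform)
print("1/S prior    :", recip)
```

This is only a sketch of the mechanism; the size of the effect in any real study depends on how flat the likelihood actually is at high sensitivity.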

Ring et al. (2012) is another recent climate sensitivity study based on instrumental data. Using the current version, HadCRUT4, of the surface temperature dataset used in a predecessor study, it obtains central estimates for total aerosol forcing and climate sensitivity of respectively -0.5 W/m2 and 1.6°C. This is a 0.9°C reduction from the sensitivity of 2.5°C estimated in that predecessor study, which used the same climate model. The reduction resulted from correcting a bug found in the climate model computer code. (Somewhat lower and higher estimates of aerosol forcing and sensitivity are found using other, arguably less reliable, temperature datasets.)


Reader Comments (87)

Could we please avoid making this thread about Gavin Schmidt's boorishness.


Jan 12, 2013 at 8:06 PM | Registered CommenterBishop Hill

It's going to be a very short thread if we just talk about his integrity.

Jan 12, 2013 at 8:10 PM | Unregistered Commenterwat dabney

Wat dabney

Wine on my computer screen thanks to you...

Jan 12, 2013 at 8:12 PM | Registered CommenterBishop Hill

Nice comment and Gavin seems to have assiduously avoided your key points. Is it me or is Gavin really hedging his bets in his last paragraph? To wit:
In the meantime, the ‘meta-uncertainty’ across the methods remains stubbornly high with support for both relatively low numbers around 2ºC and higher ones around 4ºC, so that is likely to remain the consensus range. It is worth adding though, that temperature trends over the next few decades are more likely to be correlated to the TCR, rather than the equilibrium sensitivity, so if one is interested in the near-term implications of this debate, the constraints on TCR are going to be more important.

Jan 12, 2013 at 8:24 PM | Unregistered Commenterbernie

Does anyone understand what Schmidt is getting at here?

All the pdfs are skewed - but using the mode to compare to the mean in previous work is just a sleight of hand to make the number smaller. The WSJ might be happy to play these kinds of games, but don't do it here.

In his original article Nic only referred to ranges I think. Is Schmidt making stuff up?

Jan 12, 2013 at 8:31 PM | Registered CommenterBishop Hill

[Snip - let's not go there...]

Jan 12, 2013 at 8:31 PM | Unregistered Commenterdearieme

I think Gavin is caviling that prior estimates of climate sensitivity typically are couched in terms of the mean value, so he doesn't want Nic to compare those to a modal value. In other words, he's trying to ignore Nic's "The mean is not a good central estimate for a parameter like climate sensitivity with a highly skewed distribution. The median or mode (most likely value) provide more appropriate estimates."

Jan 12, 2013 at 9:03 PM | Registered CommenterHaroldW

Jan 12, 2013 at 8:31 PM | Bishop Hill

Is Schmidt making stuff up?

I continue to count myself among the statistically-challenged, so would never pretend to understand "the science"; however, in the "making stuff up" department, I have observed that Schmidt certainly has form. [Examples available on request]

Jan 12, 2013 at 9:13 PM | Registered CommenterHilary Ostrov

Yes, I think any normal human being would have read it as a suggestion that it would be scientifically more appropriate to talk modes or medians rather than means. I'm just bemused that it would elicit the response it did.

Jan 12, 2013 at 9:14 PM | Registered CommenterBishop Hill

Gavin is basically demonstrating his ignorance or ideology. His point about making the number smaller works both ways: using the mean makes the number larger. He uses past experience as justification, which is silly. If both the median and mode are more likely, they are more useful. As a mathematician, he should know this.


Jan 12, 2013 at 9:17 PM | Unregistered CommenterMark T

Bernie - I'm not sure whether Gavin Schmidt is hedging his bets, or what.
Transient Climate Response (TCR) is a composite parameter reflecting equilibrium/effective climate sensitivity, effective ocean diffusivity and the effective depth of the ocean mixed layer, all much purer parameters each with an underlying physical basis, albeit not an exact fixed one. TCR is defined by reference to linearly increasing forcing over a 70 year period, and is not the same for periods significantly shorter or longer than 70 years. So I don't think it is a very good parameter to use. Myles Allen, and various other people, disagree.

Jan 12, 2013 at 9:17 PM | Unregistered CommenterNic Lewis
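[The TCR definition Nic Lewis refers to — the warming at the end of a 70-year linear forcing ramp to CO2 doubling — can be sketched with a one-box energy balance model. Parameter values (heat capacity, assumed ECS) are illustrative assumptions, not tuned to any GCM:]

```python
# One-box energy-balance sketch of the TCR definition: ramp the forcing
# linearly to F_2X (CO2 doubling) over 70 years, then read off the
# warming at year 70. All parameter values are illustrative assumptions.
F_2X = 3.7        # W/m^2 at CO2 doubling
ECS = 2.0         # K, assumed equilibrium sensitivity
LAM = F_2X / ECS  # W/m^2/K, climate feedback parameter
C = 13.0          # W*yr/m^2/K, roughly a 100 m ocean mixed layer

dt, years, tcr = 0.01, 70.0, 0.0
for step in range(int(years / dt)):
    forcing = F_2X * (step * dt) / years     # linear ramp in forcing
    tcr += dt * (forcing - LAM * tcr) / C    # forward Euler step

# TCR < ECS because the ocean is still taking up heat at year 70.
print(f"TCR ~ {tcr:.2f} K (assumed ECS = {ECS} K)")
```

[A single heat reservoir makes the TCR/ECS ratio come out higher than in models with a deep ocean, so treat the number as indicative of the definition, not of any real-world ratio.]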

He does seem to accept
1. That results are skewed by use of uniform priors
2. That he was wrong about Aldrin's use of indirect aerosol effects

He doesn't seem to dispute that modes and medians are more appropriate than means either.

Jan 12, 2013 at 9:17 PM | Registered CommenterBishop Hill

I imagine the dopes at SkS have been busily trying to crank out something sufficiently propagandistic on this topic for the last week...

Jan 12, 2013 at 9:27 PM | Unregistered Commenterdiogenes

"means and medians", should this be "modes and medians"?

[Already fixed!]

Jan 12, 2013 at 9:27 PM | Unregistered CommenterEddy

When a distribution is skewed toward the upper end (as climate model sensitivities are) the mode will be lower than the mean.

This assumes the skewed distribution of sensitivities out of the climate models has some basis in reality, and isn't an artifact of the models (and modellers bias).

BTW, Gavin has admitted in the past that climate model projections are really just the quantified opinions of climate modellers.

Jan 12, 2013 at 9:28 PM | Unregistered CommenterPhilip Bradley

Andrew: re means, modes and ranges.
I've no idea what Gavin Schmidt is referring to when he writes "to compare to the mean in previous work". The only comparison to previous work I gave in my comment was for Ring et al 2012. I just used the old and new central estimates Ring gave. It doesn't say whether they are means, medians or modes.

Maybe GS is referring to the IPCC's current best estimate of equilibrium/effective climate sensitivity (ECS), in the AR4 report. A comparison to that estimate, of 3°C, was given in Matt Ridley's WSJ op-ed, and my detailed article also referred to the IPCC stating that a 3°C estimate was (still) supported by the observational evidence. But GS is on very shaky ground if he claims the IPCC figure was a mean. Box 10.2: Equilibrium Climate Sensitivity in AR4 WG1 gives central estimates for all 15 ECS estimates for which they were available, including from GCM simulations as well as observational studies. In every case, the mode and median were given. In no case was the mean reported.

My original article gave both the 5-95% range and the mode (most likely estimate) for Aldrin. As I wrote, I don't think the mean is a good central measure for highly skewed distributions, particularly when the tail may be excessively fat. The tail of Aldrin's PDF for ECS is very probably too fat, as he used a uniform prior distribution for estimating ECS. Aldrin gave neither the mode nor the median (which ECS has an equal probability of being above or below, if the distribution is correct). If the estimated distribution is accurate, then the median would be a good central measure. I quoted the mode partly because I could measure it easily and accurately off his results graph, whereas deriving the mean is more involved. But the mode also has the advantage that it is much less affected than the median or mean by the use of an informative prior (which a uniform distribution is here).

Jan 12, 2013 at 9:34 PM | Unregistered CommenterNic Lewis


Hasn't Annan in the past attacked the concept of using a uniform prior when calculating sensitivity? Wouldn't you expect some kind of comment to this effect on the hallowed RealClimate site?

Jan 12, 2013 at 9:45 PM | Unregistered Commenterdiogenes

The temperature change for a doubling of CO2 seems to be decreasing with each new paper. Obviously the overwhelming positive feedback to temperature change hypothesis is not holding true.

This doesn't seem to stop the advocates though as I noticed on both Tamino's blog as well as SKS they are celebrating the success of IPCC projections by using their own graphs and not the one leaked in the AR5 SOD. Tamino actually adjusts the observed baseline to further the point.

As far as sensitivity goes, Lindzen had an interesting point in that he said that we had already experienced an increase in greenhouse gases that was effectively equal to a doubling of CO2 and that the globe had not warmed per the advocates projections. I believe he was corrected as it was not quite a doubling but still the world had not warmed per the predictions. I personally have never been able to get my head around the thought that the climate reacts to an increase in temperature by increasing the temperature.

Jan 12, 2013 at 9:46 PM | Unregistered CommenterEric H.

Yes, James Annan has indeed attacked the use of a uniform prior when estimating climate sensitivity. As I recall, he initially tried to do so by way of a comment on the Frame et al (2005) paper, but could not get the journal to publish it. So he had to write a full paper, dealing with other things as well.

The real problem is that very few people involved in climate science have any real understanding of objective Bayesian methods, which require the use of a noninformative prior. Consequently, they (and readers of their papers) can have little understanding of the extent to which the priors they have chosen bias their estimation of climate system parameters. I am doing what I can to help remedy this, along with one or two others. Unfortunately, one of the very few published climate science papers about noninformative priors, which relates to estimating climate sensitivity, is badly wrong.

BTW, I believe James Annan is a subjectivist Bayesian. Subjectivist Bayesians appear to believe the probability is a personal belief. IMO science needs objective estimates, not personal beliefs.

Jan 12, 2013 at 10:04 PM | Unregistered CommenterNic Lewis
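[As background to the objective-Bayesian point: a noninformative (Jeffreys) prior is derived from the Fisher information of the assumed observation model rather than chosen subjectively. A minimal sketch, using an invented toy model y ~ N(k/S, sd) — not any published study's likelihood:]

```python
import numpy as np

# Jeffreys' prior for a Gaussian observation y ~ N(f(S), sd) is
# proportional to sqrt(Fisher information) = |df/dS| / sd.
# Toy mapping f(S) = k / S; k and sd are arbitrary illustrative values.
k, sd = 2.775, 0.5
S = np.linspace(0.5, 10.0, 100)

# Numerical derivative of f(S) = k / S via central differences.
eps = 1e-6
dfdS = ((k / (S + eps)) - (k / (S - eps))) / (2 * eps)
jeffreys = np.abs(dfdS) / sd     # proportional to sqrt(I(S))

# Analytically |df/dS| = k / S^2, so the Jeffreys prior here is
# proportional to 1/S^2 - even more concentrated at low S than 1/S.
ratio = jeffreys * S**2
print(ratio.min(), ratio.max())  # both close to k / sd
```

[The point of the sketch is only that the noninformative prior falls out of the assumed data model; it is not a uniform prior, and for a reciprocal-type mapping it is far from one.]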

If climate sensitivity to increasing CO2 maxed out at around 240 ppm which ice core records suggest then current sensitivity is zero, hence no warming for 16 years.
Science needs evidence not estimates.

Jan 12, 2013 at 10:05 PM | Registered CommenterDung

the Bishop he say...

Wat dabney

"Wine on my computer screen thanks to you..."

With respect for those not so blessed, at least it isn't whiskyyy, with or without an eeeee!

But the Bishop is back!

Jan 12, 2013 at 10:21 PM | Registered Commenterpeterwalsh

The spelling of Whishkey on this blog leaves something to be desired ^.^

Jan 12, 2013 at 10:23 PM | Registered CommenterDung

In climate science, what is not spoken of is often more enlightening than what they will discuss. The weakness of, and flaws in, Schmidt's 'rebuttal' speak volumes.

Jan 12, 2013 at 11:21 PM | Unregistered CommenterWill Nitschke

I'm immensely grateful to Schmidt. I made one comment at RC in the dark ages, questioning the apparent leverage of CO2 on temperature, given its logarithmic scaling, and his answer was so smug I went away and continued my own assessment . . .

Jan 12, 2013 at 11:34 PM | Unregistered CommenterCapell

I see Aldrin does try a couple of other priors (1/S uniform, pre-1850 priors from Hegerl) both of which pull back the upper credibility limits from that calculated using a uniform prior. The 1/S uniform has the most dramatic effect (by aging eyeball 90% 2.5 C down from 3.4 C; 95% 3 C to 4.25 C), but presumably errs in the other direction. If Pueyo's suggestion of a log-uniform non-informational prior had been used this would have pulled the distribution up from 1/S, but presumably the upper confidence limits would have ended up lower than Hegerl (90% 3.25 C; 95% 3.6 C), even if they too had used a log-uniform non-informational prior.

This puts aside the question of Aldrin's use of uniform priors for the other parameters.

Jan 13, 2013 at 1:40 AM | Unregistered CommenterHAS

Very interesting. As best I can tell, Gavin has no effective counter-argument to Nic's conclusions. I would encourage you to include Gavin's comments so readers can see this for themselves.

Jan 13, 2013 at 2:51 AM | Unregistered CommenterFrank

To add to what Frank said, if you included Gavin's comments here we wouldn't need to pay a visit to Sleazyville to see them...

Jan 13, 2013 at 3:00 AM | Unregistered CommenterJimmy Haigh

BTW, I believe James Annan is a subjectivist Bayesian. Subjectivist Bayesians appear to believe the probability is a personal belief. IMO science needs objective estimates, not personal beliefs.

Climate science in a nutshell, thanks Nic.

Jan 13, 2013 at 8:23 AM | Unregistered CommenterAndy scrase

Andy scrase
I hasten to add that I intended no criticism of James Annan, who from my knowledge of and dealings with him seems a decent and competent climate scientist. And subjectivist Bayesians seem to be in the majority camp of statisticians - that approach is much easier to understand and implement than objective Bayesianism. Of course, in many fields the data is good enough that the prior distribution used in Bayesian analyses has little effect. Unfortunately, not in climate science.

Jan 13, 2013 at 8:50 AM | Unregistered CommenterNic Lewis

Nic, I appreciate your response. Somehow I managed to avoid most stats whilst doing my maths degree, but the concept of subjective Bayesianism seems to raise my hackles, as someone brought up on so-called hard science and maths.

Is there a reasonable intro to this topic you can point me at?

Jan 13, 2013 at 8:56 AM | Unregistered CommenterAndy scrase

"If Pueyo's suggestion of a log-uniform non-informational prior had been used this would have pulled the distribution up from 1/S,"

Actually, a log-uniform (uniform in log S) prior is identical to a 1/S prior. If you increase S by a small amount dS, the change in log_e(S) is dS/S, so working with a log(S) parameterisation the increase in the parameter × the prior is proportional to dS/S. If you retain the original S parameterisation and use a 1/S prior, the change in parameter is dS and multiplying by the prior again gives dS/S. So in either case the posterior probability assigned to the region between S and S+dS is proportional to dS/S × the data likelihood. The likelihood is independent of the parameterisation, depending only on the value of S.

However, Pueyo is wrong in thinking that a log-uniform prior is generally noninformative for S, even in the simple case he considers where S is the only parameter being estimated. His paper does not put forward a valid objective Bayesian approach, IMO. But a 1/S prior may be much closer to being noninformative than is a uniform prior.

Jan 13, 2013 at 9:12 AM | Unregistered CommenterNic Lewis
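[The equivalence Nic Lewis states can be checked numerically: a flat prior on x = log S and a 1/S prior on S yield the same posterior distribution. The likelihood below is an arbitrary illustrative choice:]

```python
import numpy as np

# Any likelihood will do for the check; this one is invented.
def lik(S):
    return np.exp(-0.5 * ((2.775 / S - 1.65) / 0.5) ** 2)

def median_from(grid, weights):
    # Posterior median, given unnormalised weights on a uniform grid.
    cdf = np.cumsum(weights)
    cdf /= cdf[-1]
    return grid[np.searchsorted(cdf, 0.5)]

# Route 1: parameterise by x = log S with a flat prior in x.
x = np.linspace(np.log(0.1), np.log(20.0), 400001)
m1 = median_from(np.exp(x), lik(np.exp(x)))

# Route 2: parameterise by S with a 1/S prior.
S = np.linspace(0.1, 20.0, 400001)
m2 = median_from(S, lik(S) / S)

# The two medians agree to grid resolution.
print(m1, m2)
```

[Any other quantile would agree equally well, since the two routes define the same distribution.]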

I can see no justification for a uniform prior distribution at any time, subjective or not. "Uniform Prior" should be a term seen as unacceptable as 2+2=5. As a simplifying assumption, it is not harmless.

If you are going to the trouble of a Bayesian update, have the decency to use a distribution with vanishing tails within a feasible region. Otherwise, tattoo "charlatan" on your forehead to save everyone time and money.

Jan 13, 2013 at 9:40 AM | Unregistered CommenterStephen Rasey

Jan 12, 2013 at 9:46 PM |Eric H

I've copied this comment to the DT Delingpole blog and put a credit to you. If you wish me to remove the comment I'm happy to so do.


Jan 13, 2013 at 12:35 PM | Registered CommenterGrumpyDenier

Nic, you wrote above "TCR is defined by reference to linearly increasing forcing over a 70 year period, and is not the same for periods significantly shorter or longer than 70 years. So I don't think it is a very good parameter to use."
But if climate response functions, such as figure 3 of Hansen et al. 2011, only achieve ~50% of their equilibrium response in a century, then ECS* is less useful. It seems unlikely to me that our energy technology of even a century hence will remain heavily fossil-fuel-based, even without subsidies to renewable energy. If such is the case, the projections of ever-increasing CO2 emissions, and corresponding atmospheric concentrations, will not be realised; and the equilibrium response will never be achieved. So, shouldn't we be more interested in the mid-term (say, sub-century) response rather than millennial effects which might well remain un-actualised?

*To be clear, ECS=equilibrium climate sensitivity, as opposed to effective climate sensitivity, which confusingly shares the same abbreviation.

Edit: The site of the link above is temporarily down. The referenced figure is figure 7 in a different (earlier?) version of the paper here.

Jan 13, 2013 at 1:21 PM | Registered CommenterHaroldW

Andy scrase
"Is there a reasonable intro to this topic you can point me at?"

I'd suggest Phil Gregory's book Bayesian Logical Data Analysis for the Physical Sciences. Probability Theory by Edwin Jaynes (who inspired Gregory) is also excellent, albeit wider ranging. Both are quite long. For a shorter introduction try Sivia's book Data Analysis: A Bayesian Tutorial. Bernardo and Smith's 1994 book Bayesian Theory is perhaps the most comprehensive, but quite mathematical.

Kass and Wasserman's review paper The Selection of Prior Distributions by Formal Rules (J Am Stat Assoc, 1996) is well worth reading, albeit a bit mathematical. And Don Fraser's papers are maybe the best at explaining the problems of Bayesian inference with curved parameter-data relationships, e.g. Default Priors for Bayesian and Frequentist Inference (J Roy Stat Soc, 2010).

Jan 13, 2013 at 1:57 PM | Unregistered CommenterNic Lewis

I know that this is going to cause a lot of eye-rolling, but why are clouds and aerosols considered “forcing” in climate? Is the climate something that can exist in a pure form without clouds and aerosols? What other substances are considered “forcing”? Is climate theory based upon an atmosphere of limited ingredients in composition (say, of only nitrogen and oxygen, its two main components on Earth)? If we have all these other factors “forcing” climate change, why are we in such a tither about CO2, which, no matter how the figures are tweaked, is – and is forecast to be, even in the most dire scenario – still but a minor component of the atmosphere (0.05%)?

Jan 13, 2013 at 2:08 PM | Unregistered CommenterRadical Rodent

@Andy scrase

For an easily readable treatment try (available on Kindle) The Theory That Would Not Die (How Bayes' Rule cracked the Enigma Code, hunted down Russian submarines and emerged triumphant from two centuries of controversy) by Sharon Berstch McGrayne.

Jan 13, 2013 at 2:21 PM | Unregistered Commentersimon abingdon

Harold W
"But if climate response functions, such as figure 3 of Hansen et al. 2011, only achieve ~50% of their equilibrium response in a century, then ECS* is less useful."

The 50% figure reflects Hansen's assumption of a high value for ECS (3 C). If ECS is fairly low (1.5 C, say), then (given the modest rate of ocean heat uptake implied by observations) much more of the equilibrium response occurs within a century - more like 85%, with over 80% occurring within 50y.

"*...ECS=equilibrium climate sensitivity, as opposed to effective climate sensivity"

Effective climate sensitivity is the reciprocal of the climate feedback (or response) parameter. It allows for ocean disequilibrium, the main cause of the difference between Equilibrium climate sensitivity and TCR.

Effective and Equilibrium climate sensitivity were treated as near synonymous in the last IPCC WG1 report, with Figure 9.20 having its x-axis labelled Equilibrium climate sensitivity even though almost all the data in it related to Effective climate sensitivity. That seems reasonable. Estimates of Effective climate sensitivity usually reflect sea-ice feedback as well as ocean heat uptake, to a greater or lesser extent depending on the estimation period and method involved. So if one ignores feedbacks with millennial timescales, in most cases Effective climate sensitivity estimates shouldn't be much different from Equilibrium climate sensitivity estimates.

Jan 13, 2013 at 2:23 PM | Unregistered CommenterNic Lewis
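[The dependence of "fraction of equilibrium response realised within a century" on the assumed ECS can be sketched with a two-box (mixed layer plus deep ocean) model. All parameter values are illustrative assumptions, so the percentages are indicative of the direction of the effect only, not of the specific 85%/50% figures discussed above:]

```python
# Two-box energy-balance sketch: a step forcing of F_2X is applied, and
# we ask what share of the equilibrium warming is reached by year 100.
# All parameter values are illustrative assumptions.
F_2X = 3.7              # W/m^2, step forcing (CO2 doubling)
C_M, C_D = 8.0, 100.0   # W*yr/m^2/K, mixed-layer / deep-ocean capacities
GAMMA = 0.7             # W/m^2/K, ocean heat-exchange coefficient

def fraction_realised(ecs, years=100.0, dt=0.01):
    lam = F_2X / ecs    # feedback parameter implied by the assumed ECS
    T = Td = 0.0
    for _ in range(int(years / dt)):
        dT = (F_2X - lam * T - GAMMA * (T - Td)) / C_M
        dTd = GAMMA * (T - Td) / C_D
        T, Td = T + dt * dT, Td + dt * dTd
    return T / ecs      # share of equilibrium warming reached

print(f"ECS 1.5: {fraction_realised(1.5):.0%}, "
      f"ECS 3.0: {fraction_realised(3.0):.0%}")
```

[With a lower assumed ECS the feedback parameter is larger, the system equilibrates faster, and a larger fraction of the equilibrium response is realised within the century, as Nic Lewis argues.]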

Sigh [rolling eyes] another dimwit without GCSEs, sigh.
Just joking: I do not know either, like most people. So allow me to drivel on a bit.

I think "forcing" is a cause-effect separation of variables in system dynamics sort of thing.

So you cannot force climate, which is a more abstract concept, but you can force "temperature anomaly", which is supposed to be something of an observable variable. Which it isn't actually; it does not make sense to talk about global temperature, temperature being a local variable. It would make sense to talk about the calorific content of the earth, being an energetically quantifiable amount.

So, dynamics: when you write f=m*a, to refer to another obsessive pedantic establishment pundit a few centuries ago, there is supposedly a driving force "f" which "causes" an acceleration to occur.
Unless, of course, you are in a situation where for example m changes (say, plasmas under fusion).
Note also accelerations can occur as a matter of sheer malpertusion stasis as Einstein described in his general relativity. Note also "=" can change if you transit universums as for example Gribbin would describe it.

Anyway I think "they" (pachauri the loco and his cheap shills) want to imply with "driver" that aerosols drive the temperature anomaly in a sort of way whereby the aerosols can be added into the system as an independent quantity and the system output variable "temperature anomaly" will expose some easily described system-dynamical behaviour to it, preferably ODE-based.

Jan 13, 2013 at 2:27 PM | Unregistered Commenterptw

Thank you, ptw; don’t worry, my skin is even thicker than I am, so you can’t offend me.

So, in other words, these “forcing” factors are used by the climatologists as an excuse as to why the climate has not acted in the way that they were predicting, even though these “forcing” factors are more-or-less permanently in the atmosphere, and so SHOULD have been part of their reasoning. Or, to put it in a nutshell, they are talking a load of bollocks.

Jan 13, 2013 at 2:54 PM | Unregistered CommenterRadical Rodent

yes they have been doing that for years now.

Because CO2 does not fo