Wednesday, January 19, 2011

Better late...

As I was promised recently, our (non-)uniform prior paper has officially appeared in dead tree format in the January issue of Climatic Change. Abstract below, and you can find it on my web site. It was waiting for a commentary which doesn't seem to have happened. I'm not sure how I should feel about that: on the one hand, it's a relief that someone doesn't get a free shot at criticising it (not that I know what they were going to say); on the other, perhaps this is the one thing worse than being talked about. I'll be interested to see if the appearance of the paper version will attract more interest (citations), given that it was officially published on-line back in 2009 anyway. It seems that many (most? all?) people have accepted the argument without explicitly referring to it. E.g., Sokolov et al. in their 2009 paper only used an expert prior, referring to their previous use of a uniform prior as merely a sensitivity analysis. However, in that previous work, it was actually the uniform prior case that was presented as the main result - and which was featured in the AR4 by the IPCC, who ignored my explicit request that the expert prior results should at least be mentioned therein. I realise our paper was not published when Sokolov et al. was written, so I'm not criticising them for not citing us. It will be interesting to see how the IPCC authors manage this particular conundrum this time around.

The equilibrium climate response to anthropogenic forcing has long been one of the dominant, and therefore most intensively studied, uncertainties in predicting future climate change. As a result, many probabilistic estimates of the climate sensitivity (S) have been presented. In recent years, most of them have assigned significant probability to extremely high sensitivity, such as P(S > 6°C) > 5%. In this paper, we investigate some of the assumptions underlying these estimates. We show that the popular choice of a uniform prior has unacceptable properties and cannot be reasonably considered to generate meaningful and usable results. When instead reasonable assumptions are made, much greater confidence in a moderate value for S is easily justified, with an upper 95% probability limit for S easily shown to lie close to 4°C, and certainly well below 6°C. These results also impact strongly on projected economic losses due to climate change.


Hank Roberts said...

Trial in a Vacuum: Study of Studies Shows Few Citations

“No matter how many randomized clinical trials have been done on a particular topic, about half the clinical trials cite none or only one of them.”

“As cynical as I am about such things, I didn’t realize the situation was this bad,” Dr. Goodman said.

It seems, Dr. Goodman said in an e-mail, that “either everyone thinks their study is really unique (when others thought it wasn’t), or they want to unjustifiably claim originality, or they just don’t know how or want to look.”

James Annan said...

Thanks, that's interesting. I've come across Goodman before...

...and on a more positive note, the first email request for a reprint has just arrived!

Jesús R. said...

I get completely lost with the details, but the conclusion is highly relevant. Hope to see this in the AR5 :)

You say: "with an upper 95% probability limit for S easily shown to lie close to 4°C,"

What would the lower 95% probability limit for S be then? Around 2ºC?

Anonymous said...

If I'm reading it right, the authors' preferred analysis gives this result:

"The resulting 5–95% posterior probability interval is 1.2–3.6 C"


Joshua Stults said...

Nice section heading: "Ignorant priors" vice the more usual "noninformative".

I agree with your criticism that uniform priors over some interval actually contain information, but that doesn't mean your conclusions follow.

I know you are critical of the usual "text book" solution to this problem, but consider this paper: Invariant Bayesian estimation on manifolds, which addresses one of the main concerns I've always had; not sure if it addresses yours, though.

David B. Benson said...

Jesús R. & Ned --- This simple model strongly suggests that the transient response so far exceeds 2 K. It is quite unlikely that the equilibrium response will be less than the transient response.

James Annan said...

The particular constraint here - though a particularly convenient one (since based on recent satellite data) - pushes the result a little lower than others might have done. The equivalent Forest et al result based on C20th warming seems to be about 1.9-4.7C (reading off their graph, I didn't find it reported in the paper). I'd certainly be happy to see other attempts at credible estimates - the more the merrier, it's important to show (test) robustness.

David, I'll have to have another look at your results...I'm a bit surprised you seem to get such a high transient result. jstults, thanks, that seems similar to Jeffreys really? We'll see how that turns out in the climate context as people are working on it. I anticipate that all reasonable alternatives to a uniform prior will give broadly similar results anyway...

David B. Benson said...

James Annan --- Oops. The 2+ K is normalized for 2xCO2. The temperature record is what it is and I just simply match it using lnCO2 plus a small correction for AMO.

I only mentioned it as providing easy to understand evidence that the Charney climate sensitivity exceeds 2 K, in agreement with IPCC AR4.

James Annan said...

Yes, I realised it was 2xCO2, but even so it seems a little surprising to see such a high estimate given that you have ignored aerosol forcing (which probably cancels out the GHG to some extent).

Also, your (generally reasonable) suggestion of a 50% uprating of the transient to equilibrium sensitivity isn't really consistent with your use of a 1 decade fixed lag, but I'm quibbling...

Anonymous said...


You wrote "you have ignored aerosol forcing", but you have ignored the albedo forcing. See:

Flanner, M. G., Shell, K. M., Barlage, M., Perovich, D. K. and Tschudi, M. A. (2011) ‘Radiative forcing and albedo feedback from the Northern Hemisphere cryosphere between 1979 and 2008’, Nature Geosci, advance online publication. doi:10.1038/ngeo1062

Cheers, Alastair.

Nosmo said...

A nice simple article in NY Times about misuse of statistics:

David B. Benson said...

James Annan --- I was pleasantly surprised to see how well that zero reservoir model does with a one decade lag. I tried both semidecades and bidecades and neither did anywhere close to as well.

Anonymous said...

Hansen and Sato 2011 use priors that really are prior. Sensitivity is found from the relation of past temperatures to past forcings (both estimated), the latter based in large part on past CO2 measured in ice cores. They find S = 3. However, if past temperatures were higher then S is higher, and vice versa, and either way we are in dangerous territory with respect to sea level change. Thus if you find lower S this cannot give economists and planners much comfort. Only ecologists get a break.

Pete Dunkelberg

James Annan said...

Nosmo, I like that article. Andrew Gelman has also blogged about that paper.

Alastair, no I haven't :-)

Paul said...

Curry has noticed your paper. Doesn't that warm the cockles of your heart?

Alastair said...

Well James, I find that rather surprising! The word albedo does not appear anywhere in your paper :-(

jules said...


Tee hee!

Now James gets to re-consider... "...perhaps this is the one thing worse than being talked about"


Deep Climate said...

You are right up there with Schwartz!

Ron Cram:I am a fan of Stephen Schwartz of Brookhaven National Labs. ... In 2010, he also published “Why hasn’t Earth warmed as much as expected?”

curryja: Thanks for the links, i agree these are good papers.

Ron Cram: Good papers and they were published after AR4. Annan and Hargreaves think it is “very likely” sensitivity is less than 4.5C. But Schwartz paper increases the chance sensitivity is below 1.5C. The Schwartz estimate is based on CRU data. If there is a systemic warming bias in CRU data (as Climategate hints there may be), then the Schwartz estimate may be significantly too high.

curryja: [Silence]

You might want to set the record straight on a reasonable 5% lower bound for S.

Pekka Pirilä said...

I would like to have your comment on these observations by Alexander Harvey.

My impression is that he has noticed a serious error in your calculations. If he is right, as I believe, the effect is minuscule in the case that you discuss.

James Annan said...

Thanks for the links. Actually, I already had something on Curry in the works, and will post it shortly. I'd even seen Alexander Harvey's comment and was waiting to see if any reader over there could answer it - or indeed if he would try asking me...

Pekka Pirilä said...

As there was one additional question in Judith's blog, I decided to continue there. My impression is still that your calculations are seriously in error and the quantitative results totally wrong.

On the other hand I agree that the choice of prior may in some cases strongly influence the results. Your example was, however, not one of those cases as far as I can see.


Pekka Pirilä said...

After one badly slept night, I know what I missed in the argument. The full explanation is at

As I state there, I should have seen this immediately. One problem is that I read your paper a couple of months ago and checked only some parts of it now on a computer screen. For me this seems to increase the risk of missing something essential in the content.

I think that your sentence "A Gaussian likelihood in feedback space has the inconvenient property that f(O|L = 0) is strictly greater than zero, and so for all large S, f(O|S) is bounded below by a constant." is not the best way of describing the issues related to the fact that the relationship between S and L breaks down at L = 0. It is possible that L < 0, but that does not correspond to S < 0, but rather to an oscillating or unstable system.
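The point about f(O|S) being bounded below can be illustrated with a quick numeric sketch. The Gaussian parameters and the 3.7 W/m² doubling-forcing constant below are illustrative choices, not the numbers from the paper: a Gaussian likelihood in L, viewed through S = F2X/L, flattens to the nonzero constant f(O|L=0) as S grows, which is exactly why a uniform-in-S prior yields a posterior tail that never decays.

```python
import math

F2X = 3.7              # forcing for doubled CO2 in W/m^2 (conventional value)
MU, SIGMA = 1.25, 0.5  # illustrative Gaussian likelihood for L, in W/m^2/K

def gauss(x, mu, sigma):
    """Gaussian density; stands in here for the likelihood f(O | L)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_S(S):
    """The same likelihood viewed as a function of sensitivity, via L = F2X / S."""
    return gauss(F2X / S, MU, SIGMA)

# As S grows, L -> 0 and f(O|S) flattens towards the nonzero constant f(O|L=0),
# so integrating a uniform-in-S posterior out to infinity diverges:
floor = gauss(0.0, MU, SIGMA)
for S in (10.0, 100.0, 1000.0):
    print(f"S = {S:7.1f}   f(O|S) = {likelihood_S(S):.5f}   floor f(O|L=0) = {floor:.5f}")
```

With these (made-up) numbers the likelihood at S = 1000 is already within a couple of percent of the floor value, so the choice of prior, not the data, controls the high tail.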

James Annan said...

Ah, glad to see that it was all my fault all along, even though my calculations are in fact correct :-)

Pekka Pirilä said...

I am really embarrassed about the unfounded certainty that I expressed in my postings. I should have known better and I apologize for that.

While I was looking deeper into the problem, the question arose whether it is more proper to give a prior for L or for S. If one accepts the mechanism of feedbacks as the main source of the potentially high sensitivity, the conclusion could be that one should consider the strength of the feedback when proposing a proper prior. Alternatively one might wish to restrict the prior by combining arguments related to the feedback with arguments based directly on sensitivity.

Looking at what we can learn from feedbacks, it appears to me quite reasonable to argue that the value L = 0 does not have such a special status that it would allow for a prior peaking strongly at values very close to it on the positive side. It would rather be natural to argue that, in the absence of all observations, the strength of the feedbacks could equally well be larger than 1 as slightly smaller than 1. This would mean that the first prior should be smooth at L = 0 (the strength of the overall feedback is 1 - L).

Adding some observations to this first prior, the first thing to include might be the observation that the climate is not strongly unstable. This means that L > 0. This requirement could be included in the prior for further considerations. On this basis we could assume that the prior must have a finite value at L = 0 and behave smoothly near this point. The value at L = 0 may be 0, but need not be, and allowing it to have a non-zero value would imply less information than requiring it to be zero.

A finite value of the pdf of L at L = 0 means that the tail of the prior for S must be bounded by C/S^2, where C is some constant. This would give additional support for a Cauchy-like distribution.
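This tail bound follows from a simple change of variables, and can be checked numerically. The half-Gaussian prior on L and the 3.7 W/m² constant below are hypothetical illustrations: any prior with a finite density at L = 0 maps, via S = F2X/L and the Jacobian |dL/dS| = F2X/S², to an S-prior whose large-S tail sits under C/S² with C = F2X · p_L(0).

```python
import math

F2X = 3.7  # forcing for doubled CO2 in W/m^2; we take S = F2X / L

def prior_L(L):
    """Illustrative prior on L: a half-Gaussian, finite and smooth at L = 0."""
    if L < 0:
        return 0.0
    return 2.0 * math.exp(-0.5 * L ** 2) / math.sqrt(2.0 * math.pi)

def prior_S(S):
    """Implied prior on S via the change of variables L = F2X / S."""
    return prior_L(F2X / S) * F2X / S ** 2   # Jacobian |dL/dS| = F2X / S^2

# For large S the density is bounded by C / S^2 with C = F2X * prior_L(0):
C = F2X * prior_L(0.0)
for S in (10.0, 50.0, 200.0):
    print(f"S = {S:6.1f}   prior_S = {prior_S(S):.6f}   C/S^2 = {C / S ** 2:.6f}")
```

As S grows, prior_S(S) · S² converges to C, i.e. the Cauchy-like 1/S² tail is exactly the behaviour forced by a finite, smooth prior at L = 0.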

Perhaps all this has been clear to you, but my purpose is to switch from badly founded criticism to constructive dialog.


James Annan said...


I don't think it should matter whether the prior is stated in terms of L or S - the important thing is that it should represent a reasonable belief, but it's possible that one viewpoint will lead more intuitively to such beliefs. The FG choice of uniform in L was really an accident (they weren't really doing a Bayesian analysis, merely presented their confidence interval as a probability interval) and so was never likely to be taken seriously as a Bayesian estimate - indeed they were quite diffident about it themselves.

Even the Cauchy-type prior has a big problem when it comes to conventional economic analyses if you include a sufficiently nonlinear utility a la Weitzman. I view that as probably a problem with conventional economic analyses though :-) I'm not sure how many readers will have realised this. I didn't mention it in the paper as it would have required a significant digression.
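A toy calculation makes the problem concrete. The quadratic damage function and the unnormalised Cauchy-like prior below are illustrative stand-ins, not Weitzman's actual specification: with a prior tail ~1/S² and a loss growing like S², the integrand of the expected loss tends to a constant, so the truncated expectation grows without bound as the cutoff is raised.

```python
import math

def cauchy_tail(S):
    """Illustrative unnormalised Cauchy-like prior on S, with tail ~ 1/S^2."""
    return 1.0 / (1.0 + S ** 2)

def damage(S):
    """Hypothetical strongly convex loss, D(S) = S^2."""
    return S ** 2

def truncated_expected_damage(T, n=200000):
    """Riemann-sum approximation of the expected loss on [0, T]."""
    h = T / n
    return sum(damage(i * h) * cauchy_tail(i * h) * h for i in range(n + 1))

# The truncated expectation keeps growing roughly linearly with the cutoff T,
# so the full expected loss is infinite no matter where the data place the bulk:
for T in (10.0, 100.0, 1000.0):
    print(f"cutoff T = {T:7.1f}   truncated expected loss = {truncated_expected_damage(T):.2f}")
```

Analytically the truncated integral is T - arctan(T), which diverges linearly in T; any damage function growing at S² or faster has the same effect under a 1/S² prior tail.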

I wonder if a few more years of data has changed anything...they only used data from 1985-1996 with a gap for Pinatubo. But I don't know what radiation data are available since then.

Pekka Pirilä said...

I understand that the prior can equally well be expressed in any variable with a differentiable one-to-one relationship to another.

As the prior represents rational expectations, all arguments influencing these expectations should be taken into account, and looking at the prior in terms of a particular variable may be helpful in this respect. In my previous message I proposed that a finite bound on the prior, when expressed as a pdf of L, is a possible requirement.

I agree on the sensitivity of the economic results to assumptions about the possible development of future damage costs and about how discounting is used in determining the present value of total costs. Several people (including Stern) have made proposals that may lead to very high costs for the higher climate sensitivities. With such estimates, 1/S^2 may not be a fast enough cutoff to make the results finite, or insensitive to the details of the prior. Still, accepting that the prior for S must fall off as 1/S^2 or faster for large S would help in reducing the uncertainties.