(UPDATE 16/1/2011 for those who think I lost the bet, have a look here first.)
So, the program went out on Monday (and can be listened to on-line for those who missed it) and IMO they covered the issue very well. In fact the whole program was interesting - I'm a bit embarrassed not to have discovered it earlier. The presenter seems to do a lot of interesting things - he also has his own webpage/blog.
David Whitehouse stated very clearly on the program that he would bet £100 against a new record by 2011, although I've not yet had a reply to the email I sent on Monday. If any reader knows him directly, I'd be grateful for some contact. In the comments to my earlier post, Chris Randles seemed a bit sceptical of my justification for why I think the bet is a good one (and why I think 30% per year is a reasonable estimate for the probability of a new record in 2009-2011), so here's a bit more background. Actually 30% was partly an inverse calculation - "what would the probability have to be to make the bet an attractive one?" - but I don't think it is overly optimistic, for the following reasons.
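For those who want to see that inverse calculation spelled out, here's a toy sketch (my own illustration, not from the original post): at even odds on "a new record by 2011", the bet breaks even when the probability of at least one record over the three years is 50%, which - assuming each year is independent - corresponds to a per-year probability of about 21%.

```python
# Even-odds bet: break-even when P(record in 2009-2011) = 0.5.
# Assuming each year independently sets a record with probability p,
# P(at least one record in 3 years) = 1 - (1 - p)**3.
break_even_p = 1 - 0.5 ** (1 / 3)   # per-year probability needed to break even
print(f"break-even per-year probability: {break_even_p:.3f}")  # about 0.206

# With the 30% per-year estimate from the post:
p_win = 1 - (1 - 0.3) ** 3
print(f"win probability at p = 0.3: {p_win:.3f}")  # about 0.657
```

So a 30% per-year chance comfortably clears the roughly 21% break-even threshold, which is why the bet looks attractive.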
First, here is a bit of Fig 10.4 from the recent IPCC report. The left hand half consists of the last 20y of the models' historical 20th century simulations, and the right half is the first 20y of projections under various scenarios. The yellow/orange future is for fixed atmospheric constituents (at year 2000 values) so is not at all realistic, but the other scenarios all show a near-linear response on average.
I drew on the green line by hand, but it has the right slope for the historical trend. The point here is that for all models under all plausible scenarios, the climate response is very close to linear plus natural variability "noise" over this sort of time scale. In fact the models tend to show a modest increase in trend, although I personally would not be too surprised if this does not actually transpire. But over the next 5 years that is not an issue anyway.
This linear response is a well-known property of the system, basically due to thermal inertia and the steady ongoing increase in CO2. The only real question is, what is the true underlying trend?
Now here's a modified version of the graph I showed last time.
Here, all the blue lines are the even-year trends from 8 to 40 years duration, extrapolated out to 2011 (I just chose even numbers to limit the clutter). Most of these exceed the 1998 record before 2011, some by quite a lot. I've also put dots on the 30y trend line (a rather arbitrary choice, but actually one of the lower trends) just to show that it crosses the 1998 record between 2010 and 2011.

Some important details: the residuals have negligible autocorrelation, so it is reasonable to consider each year's anomaly (compared to the trend) as independent. Also, the RMS of the residuals is about 0.08, and the 1998 value is 2.5 standard deviations (a 1% event, if we assume Gaussian residuals, which I would not hang my hat on but it's a reasonable starting point). None of the cold anomalies are that extreme, not even those due to the Pinatubo and El Chichon eruptions. In fact the cold anomaly in 1996 is greater than either of those, with no volcano to explain it. The 2007 cold anomaly is also an entirely unremarkable 0.8 standard deviations, and all 5 years from 2001-2005 were above the trend line. So there is absolutely no evidence of any strange cooling in the recent record - it's just that 1998 was abnormally warm.

Using a conservative "underlying" temperature anomaly of 0.5 (on that scale) over the interval 2009-2011, we only need a further 0.026 due to natural variability - about 0.3 standard deviations, a 40% event - to break the old record. So that makes my 30% guesstimate seem a touch pessimistic. Of course the historical trend may be somewhat "contaminated" by some natural variability or other forcings. But this may work either way: it arguably increases the uncertainty a bit, but should not bias the results significantly (in fact an increase in uncertainty for the trend in future temperatures would make the odds closer to 50% for each year, which is better for me).
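The arithmetic above can be checked with a few lines of Python (a sketch under the stated Gaussian-residuals assumption, using the post's own numbers: RMS residual 0.08, and a further 0.026 of natural variability needed to reach the 1998 record):

```python
import math

def normal_tail(z):
    """P(Z > z) for a standard normal Z, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

sigma = 0.08     # RMS of residuals about the trend (from the post)
needed = 0.026   # further anomaly needed above the assumed underlying 0.5

z = needed / sigma          # about 0.33 standard deviations
p_year = normal_tail(z)     # per-year probability of exceeding the record
print(f"per-year probability: {p_year:.2f}")  # close to the ~40% quoted

# Probability of at least one record over 2009-2011, treating the years
# as independent (justified by the negligible residual autocorrelation):
p_win = 1 - (1 - p_year) ** 3
print(f"three-year win probability: {p_win:.2f}")
```

On these numbers the per-year chance comes out around 37%, and the chance of at least one record in the three years is roughly three in four - consistent with the claim that 30% per year is, if anything, a touch pessimistic.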
Finally, this is from Smith et al which I talked about earlier. I've added on a couple of lines as guides.
The green one is just the trend, extrapolated (again positioned by hand, but the slope is right). This plot uses seasonal anomalies so the link to the annual average is not perfect, but it should be good enough. Their own model forecast (white line with red shading) exceeds the old record from 2008 onwards, with high confidence. In fact their paper contains the explicit statement that "at least half of the years after 2009 are predicted to be warmer than 1998, the warmest year currently on record." Actually, on re-reading it, that phrase is slightly ambiguous; I suspect they mean that each year from 2009 will beat the old record with probability 50% or more (obviously, if you run out far enough into the future, then the vast majority of years will break the old record). So according to that statement, I've got a 1 - 0.5^3 = 87.5% chance of winning.
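Taking the Smith et al statement at face value - each of 2009, 2010 and 2011 beats the record with probability at least 0.5, and the years are treated as independent - the 87.5% figure is just the complement of missing in all three years:

```python
# P(no record in any of 2009, 2010, 2011) = 0.5**3, assuming independence,
# so P(at least one record) is the complement:
p_win = 1 - 0.5 ** 3
print(p_win)  # 0.875
```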