Several years
ago, when we first got involved in paleoclimate research and were
thinking about using the Last Glacial Maximum to constrain models, jules asked a respected researcher for an estimate of how cold
the LGM was, in terms of a global annual mean temperature anomaly
relative to the modern pre-industrial climate. Their surprising response
was, "I haven't a clue". Of course this wasn't quite true, as the
person in question was surely confident that the LGM was at least a
degree colder overall than the preindustrial state, but not as much as
(say) 20C colder. But they weren't prepared to specify a global average
value, because the available proxy data were few and far between, and
could not be considered to give good global coverage.
When we searched the literature, we found plenty of analyses of proxy data on a regional basis (such as SST in the tropics, and various polar ice cores), but not much relating to a global average value. Some
people (including Jim Hansen) have given rough estimates of about 5-6C, and we
used similar arguments to give an estimate of 6C when we needed one for
this paper.
But it was a bit hand-wavy. As for modelling results, in 2007, the IPCC
AR4 gave an estimate of 4-7C colder than pre-industrial, which was
based more-or-less directly on GCM simulations from the PMIP2 project.
Around the same time, various people (including us here) started more formally combining models with proxy data, by running model ensembles with different parameter values, and came up with similar estimates (e.g. here and here). So, a consensus view of around 5-6C as a best estimate seemed well established.
Since then, several major compilations of proxy data have been published (most notably, MARGO for the ocean, and Bartlein et al
for land), and although the MARGO authors did not generate a global
mean estimate, there were strong hints that their data set was a bit
warmer than the previously prevailing interpretation of ocean proxy
data, especially in the tropics. When Andreas Schmittner and colleagues
published their Science paper last year, their climate
sensitivity estimate made the headlines, but it was actually their LGM
reconstruction that was more immediately eye-catching to us. They fitted
a simple(ish) climate model to these most recent and comprehensive
proxy syntheses, and came up with a global mean temperature anomaly of
only 3C (with an uncertainty range of 1.7-3.7C), which is far milder
than most previous estimates. Irrespective of the resulting sensitivity
estimate (which we'll return to later), such a warm LGM would be hard to
reconcile with GCM simulations. Therefore, we thought it would be worth
considering the LGM temperature field as a stand-alone problem. One
obvious weakness of Schmittner et al's paper is that due to
computational limitations, they only used a fairly simple climate model,
which didn't seem to fit the data all that well.
Our main idea was to see if we could do a better job using state-of-the-art GCMs. That requires a rather different approach, as we can't run large ensembles of these models.
Our resulting paper is still under review at CP, and available here. It has had useful and positive reviews, and I'm now revising it, but I don't expect the overall answer (which is an LGM cooling of 4.0+-0.8C) to change.
Our main result comes from a multi-model pattern-scaling approach, based on a method I'd seen in the numerical weather prediction literature. It's actually just a simple multiple linear regression. The idea is that each model (from the PMIP2 database)
is assigned a scaling factor in order that their weighted sum
optimally matches the data (and hopefully, interpolates meaningfully
between these points). This gave substantially better results than
either of the other two methods (which I'll describe below for the
interested reader), and also worked well for a number of validation
tests. The resulting reconstruction fits the data rather better than the results that
Schmittner et al reported. The resulting estimated temperature anomaly
is a nice round 4.0+-0.8C, with this uncertainty representing a 95%
confidence interval. So that is a bit colder than Schmittner et al
estimated, but also substantially warmer than most previous estimates.
It's towards the low end of the range of GCM results, though with
significant overlap. Our formal uncertainty range doesn't take account
of the possibility that someone might come along in a few years and
decide that these proxy data are all misinterpreted or hugely biased. For
example, some people argue that the MARGO ocean data are too warm,
especially in the tropics. We don't have a particular position on that,
but intend to report some more sensitivity analyses in the final
paper. Our uncertainty estimate just reflects the need to interpolate into the data voids, and the fact that different models suggest somewhat different interpolations.
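For anyone who wants to see the regression written down, here is a minimal sketch of the idea in Python. All of the array shapes, weights and noise levels below are invented for illustration; the real calculation uses the PMIP2 model fields and the actual proxy locations.

```python
import numpy as np

rng = np.random.default_rng(0)
n_models, n_grid, n_proxy = 5, 1000, 80

# Synthetic "model ensemble": each row is one model's LGM anomaly field
models = rng.normal(-4.0, 2.0, size=(n_models, n_grid))

# Pretend the truth is some unknown combination of the model patterns
true_weights = np.array([0.5, 0.1, 0.2, 0.1, 0.1])
truth = true_weights @ models

# Proxy data: the truth sampled at a sparse set of locations, plus noise
idx = rng.choice(n_grid, size=n_proxy, replace=False)
proxies = truth[idx] + rng.normal(0.0, 1.0, size=n_proxy)

# Multiple linear regression: find the weighted sum of model fields
# that best fits the proxies (least squares at the proxy locations)
A = models[:, idx].T                      # (n_proxy, n_models) design matrix
weights, *_ = np.linalg.lstsq(A, proxies, rcond=None)

# The reconstruction is the weighted sum over the FULL grid, so the
# model patterns do the interpolating into the data voids
recon = weights @ models
print("global mean anomaly:", recon.mean())
```

The point of using the full model fields in the last step is that the physics of the models, rather than an arbitrary smoother, fills in the regions with no data.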
And here is what you have all been waiting for - pictures of our reconstruction of the LGM anomaly, which are probably worth more than the 1000 words of this blog post. First, the surface air temperature anomaly, and then the sea surface temperature anomaly. These match closely over the open ocean, but at high latitudes the layer of sea ice insulates the ocean and allows them to diverge.
The dots are the proxy data points. Overall, the main features are very much as expected. Some of the smaller details are probably just noise, like the apparent "hole" in the top plot in the Southern Ocean at around 70S 30W, and also the one in the N Pacific. The method generates uncertainty estimates which are particularly large in these regions. The slight warming in the Arctic SST (under thick sea ice) is in accordance with the proxy reconstructions; the warming in the south is more speculative, as there are no data there.
What this means for climate sensitivity is left for a future post (some readers may be ahead of me at this point)...
Appendix: For anyone who is interested, here's a quick summary of two other methods we investigated. The paper has a fuller description.
We
first tested the simplest method we could think of: direct smoothing of
the data, as is commonly performed for modern temperature data by the
likes of GISS and NCDC. However, the paleoclimate data are much more
sparse and noisy than modern data, and the results looked pretty messy.
Worse, when we tried sampling "pseudoproxy data" from GCM results (i.e., taking data from a model in the same locations as the real proxy data) and smoothing these into fields to see how well we could recover the full field of GCM output, the results from this process had a huge overall bias.
The basic reason for this is that there are no proxy data available in
the areas which were covered by huge ice sheets back then but which are
now bare (particularly North America and Scandinavia), and it is these
regions which exhibit the largest anomalies for the LGM. So although we
could smooth the real proxy data and obtain an estimated global mean
temperature anomaly of 3.2C, we also knew this could easily be biased by
a degree (or more, or less) compared to the real temperature.
Bias-correcting this (based on the results
generated by the pseudo-proxy experiments) gives an estimate of
4.1+-0.9C cooling. An ugly feature of the smoothing is that the spatial
pattern generated is completely unreasonable, not showing any of the
extreme cooling that must have occurred (simply due to direct albedo and
altitude effects) over the large ice sheets that were in existence
then.
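The sampling bias is easy to demonstrate with a toy version of the pseudoproxy experiment. Everything below is invented for illustration (the grid, the ice-sheet fraction, the anomaly magnitudes); the real test uses full GCM fields and the actual proxy locations.

```python
import numpy as np

rng = np.random.default_rng(1)
n_grid = 1000

# Toy "GCM truth": moderate cooling everywhere...
field = rng.normal(-4.0, 1.5, size=n_grid)

# ...except over the former ice sheets, where cooling is extreme
ice_sheet = np.zeros(n_grid, dtype=bool)
ice_sheet[:150] = True                 # say, 15% of the surface
field[ice_sheet] -= 15.0

true_mean = field.mean()

# Pseudoproxies: sample ONLY outside the ice sheets, as real proxies do
candidates = np.where(~ice_sheet)[0]
idx = rng.choice(candidates, size=80, replace=False)
pseudo = field[idx]

# Crude "smoothed" global estimate = mean of the sampled points
sampled_mean = pseudo.mean()
bias = sampled_mean - true_mean        # positive: the estimate is too warm
print(f"true {true_mean:.1f}, sampled {sampled_mean:.1f}, bias {bias:+.1f}")
```

Because the unsampled ice-sheet regions are also the coldest, the sampled mean comes out warm of the truth by a couple of degrees in this toy setup, which is the same qualitative effect we saw with the real data.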
So, we quickly concluded that we needed to use a model of some sort to
interpolate between the data points. As an alternative to running an
ensemble of simple climate models, we decided to use the state of the
art PMIP2 model results which were generated by essentially the same
full complexity GCMs which also contributed to the IPCC AR4 (CMIP3
project). We obviously
weren't able to re-run ensembles of these models with different
parameter values, so instead, we just used a simple pattern-scaling
approach to fit them to the data. We think that this is pretty much comparable to changing the sensitivity parameter of a simple model - opinions seem to differ on the extent to which this is true, but anyway it's all we could reasonably do with these models.
The results were markedly better than we got from the smoothing, but still rather disappointing. One minor curiosity, which we had not anticipated but which
is obvious with hindsight, is that this pattern-scaling approach tends
to lead to an underestimate of the overall temperature change. This is
essentially due to regression attenuation which interested readers can
google for more info (or look at our manuscript). Anyway, after bias correction this gives an overall estimate of about 4.5C of cooling, but the uncertainty is actually a bit higher than for smoothing, because different model patterns interpolate into the data voids differently.
So these two methods generate results which are compatible with our main result, but which are somewhat inferior.