In a previous post (here) I reported that 150 of the world’s leading climate modellers had gathered in Reading to discuss their need for computers one thousand times more powerful than the present generation of supercomputers. Now I want to look at another couple of aspects of this conference.
Here is an extract from the delegates’ official website under the heading Background:
The reality of climate change has been accepted by the world. Thanks to the sustained, comprehensive and objective assessments by the Intergovernmental Panel on Climate Change (IPCC), a consensus, with a high degree of confidence, has emerged in the scientific community that human activities are contributing to climate change. A systematic program of numerical experimentation with climate models during the past 40 years has played a crucial role in creating this scientific consensus, and in its acceptance by the world.
Now is it just me, or does this sound like someone whistling in the dark to keep their spirits up? Surely if what this paragraph says is true and incontrovertible, then there is no need to restate it to the very people who have brought about this situation? Presumably it is something that they are very well aware of; if there is a confident consensus, then these scientists, of all people, must be part of it. Or is it possible that some of them have not noticed and need reminding?
The Background statement continues:
The nations of the world have therefore begun, with great urgency, discussion about mitigation and adaptation to climate change, the inevitability of which is now beyond doubt. The climate models will, as in the past, play an important, and perhaps central, role in guiding the trillion dollar decisions that the peoples, governments and industries of the world will be making to cope with the consequences of changing climate.
The climate modeling community is therefore faced with a major new challenge: Is the current generation of climate models adequate to provide societies with accurate and reliable predictions of regional climate change, including the statistics of extreme events and high impact weather, which are required for global and local adaptation strategies? It is in this context that the World Climate Research Program (WCRP) and the World Weather Research Programme (WWRP) asked the WCRP Modelling Panel (WMP) and a small group of scientists to review the current state of modelling, and to suggest a strategy for seamless prediction of weather and climate from days to centuries for the benefit of and value to society.
This leaves us in no doubt about the importance of the issues being discussed at Reading: no less than the scientific basis for research that will shape the thinking of politicians and policy makers, thereby influencing ‘trillion dollar decisions’. It would seem reasonable to expect that, at this point, there might be some discussion of the confidence that can be attached to the predictions that the models provide. After all, if they influence ‘trillion dollar decisions’ it would be wise to consider whether they are likely to be right. But the modellers seem to be concerned with something else: creating models that provide higher definition, not greater reliability.
What then is the distinction between the predictive skill of models and the definition of the predictions that they make?
So far, climate models can be described as having low definition. They make predictions about global temperatures, and associated weather patterns such as increased precipitation or drought, over a long period, typically one hundred years ahead. It is no good asking them what will be happening in Manchester during the late 2060s, or even what average rainfall will be in south-east Africa in ten years’ time. The present generation of models cannot do this because they do not deliver this level of detail. But from a practical point of view, this is precisely the kind of information that policy makers and business now want. The modellers say that without more powerful computers, they cannot provide this.
In part 1 of this post (here), we saw that the uncertainties surrounding the predictions made by climate models have not been reduced during the last two decades. This has a great deal to do with our less than complete knowledge and understanding of the immensely complex processes that drive the earth’s climate. Are the modellers really talking about massive investment in a new generation of supercomputers so that they can provide more detailed predictions, but without reducing their level of uncertainty? It would seem that they are.
Here is what Prof. Julia Slingo, Director of Climate Research, Centre for Atmospheric Science at Reading University told the BBC:
“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.
“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”
Professor Slingo said several hundred million pounds of investment were needed.
But what about those fundamental uncertainties? Here is a little more from the same BBC report:
Knowing the unknowns
One trouble is that as some climate uncertainties are resolved, new uncertainties are uncovered.
Some modellers are now warning that feedback mechanisms in the natural environment which either accelerate or mitigate warming may be even more difficult to predict than previously assumed.
Research suggests the feedbacks may be very different on different timescales and in response to different drivers of climate change.
Just last week, preliminary research at the Leibniz Institute of Marine Sciences in Kiel, Germany, suggested that natural variations in sea temperatures will cancel out the decade’s 0.3C global average rise predicted by the IPCC, before emissions start to warm the Earth again after 2015.
IPCC authors said this was not incompatible with their models; but the German research provoked some sceptics to ask whether models could be believed at all.
Of course, if the uncertainties surrounding the models are large enough, then nothing will be incompatible with them. Here is another modeller’s somewhat gnomic response to this problem:
“If we ask models the questions they are capable of answering, they answer them reliably,” counters Professor Jim Kinter from the Center for Ocean-Land-Atmosphere Studies near Washington DC, who is attending the Reading meeting.
“If we ask the questions they’re not capable of answering, we get unreliable answers.”
What does all this mean in practical terms? This is something else that Prof. Slingo told the BBC:
Climate models have been enormously successful in alerting us to the dangers of our activities in emitting greenhouse gases and so forth.
The use of the word ‘successful’, coming from a scientist, is rather strange in this context. Climate models make predictions about what will be happening to our planet far in the future. Therefore it will be many decades before we will know whether they are ‘successful’. All we can be certain of at the moment is that the terrifying output from these models has been very successful in persuading people that anthropogenic global warming is a problem.
This conference might be seen as an esoteric brainstorming exercise that is relevant only to those directly involved in this highly specialised branch of research, but looking at the list of participants we find that this is not the case. Some of the very biggest names in climate science have come to Reading, including Rajendra Pachauri, Chairman of the IPCC, Michel Jarraud, Secretary-General of the World Meteorological Organization, and Kevin Trenberth, one of the coordinating lead authors on the IPCC’s Fourth Assessment Report, which was published last year. The term summit is not a misnomer.
What conclusions should we draw from what we know about this summit? Probably only that there is very considerable confusion in the world of climate modelling at the moment, and this calls into question the relevance of model predictions when considering ‘trillion dollar decisions’. Climate modelling may not need vastly expensive new hardware, but rather a period of reflection during which confidence in predictions about future climate can be rigorously re-assessed.