This week some of the world’s most eminent climate scientists have been attending a little get-together in Reading (yes really) with the snappy title of the World Modelling Summit for Climate Prediction. Their aspirations, as described on the official conference website, are quite modest:
The underlying goal of the summit is no less than to prepare a blueprint to launch a revolution in climate prediction.
Now this surely is something that we should all applaud. If one had to pick a single area of climate science that has contributed most towards convincing policy makers and the public that our planet faces a deadly peril from anthropogenic climate change, then the output from models would be a very obvious candidate. It is these vastly complex programs, run on the world’s most powerful supercomputers, which warn of rising temperature trends during the rest of this century.
Observed temperature change over the last hundred and fifty years is not really that scary (see here), but the predictions from models, lovingly featured in IPCC reports, are definitely not for the faint-hearted. The problem is that these are just predictions: estimates of what may happen if a lot of assumptions about the way the climate works are right. We will have no way of telling whether they come even close to the truth for decades.
Even among advocates of global warming there is concern that the confidence being placed in models is misplaced. For them to work at all, a complete description of the processes they are intended to simulate must be fed into the computer. At present this is impossible, as our understanding of how the climate functions is still incomplete. Among other things, both the physics of clouds and the all-important carbon cycle still present problems that modellers can only solve by a process known as parameterization: in other words, estimating what might be happening on the basis of a partial understanding of these processes. So it is no surprise to see even the BBC, an organisation not usually given to doubts about the evidence for global warming, reporting that:
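To make the idea of parameterization concrete, here is a toy sketch, not taken from any real climate model: every name, formula and number is invented for illustration. A process the model cannot resolve (here, cloud cover within a grid cell) is replaced by a simple formula with a tunable "free" parameter, and different plausible choices of that parameter give different answers.

```python
# Toy illustration of "parameterization": a sub-grid process the model
# cannot resolve (cloud cover) is replaced by a simple tuned formula.
# Hypothetical throughout -- not any real model's scheme.

def cloud_fraction(relative_humidity, rh_crit=0.8):
    """Estimate the fraction of a grid cell covered by cloud from its
    mean relative humidity. rh_crit is a tunable free parameter:
    different defensible choices yield different simulated climates."""
    if relative_humidity <= rh_crit:
        return 0.0
    return min(1.0, (relative_humidity - rh_crit) / (1.0 - rh_crit))

# Same atmosphere, two defensible parameter choices, two answers:
print(cloud_fraction(0.9, rh_crit=0.8))  # ~0.5
print(cloud_fraction(0.9, rh_crit=0.7))  # ~0.67
```

The point is not the formula itself but the free parameter: where understanding is incomplete, the modeller must pick a value, and the model's output inherits that guess.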
This week, about 150 of the world’s top climate modellers have converged on Reading for a four day meeting to plan a revolution in climate prediction.
And they have plenty of work to do. So far modellers have failed to narrow the total bands of uncertainties since the first report of the Intergovernmental Panel on Climate Change (IPCC) in 1990.
What then is the breakthrough that will bring about a revolution in climate research? According to Professor Jim Kinter of the Centre for Ocean-Land-Atmosphere Studies in America, climate modellers need computers that are one thousand times more powerful than the most powerful supercomputers that have been developed so far.
But it is by no means clear how a new generation of hardware will lead to more accurate models unless improved understanding of the climate makes parameterization unnecessary, and no one is expecting that to happen any time soon.
As another delegate at the conference pointed out, increased computing power would bring a further advantage. Businesses that are particularly concerned about climate change, and need to plan future investment, are eager for more detailed predictions than are presently available. The CEO of one such organisation, Anglian Water, was interviewed for the BBC Radio 4 Today programme on Tuesday (Listen Again, item at 08:51). He said that he is convinced that global warming is inevitable, and that we face changes in precipitation patterns over the next few decades; probably more rain in winter and more frequent summer droughts. What he wants to know - and presumably his company is prepared to pay for this - is just how extreme these changes will be, when they will occur and where they will occur. This would, in his opinion, allow Anglian Water to plan ahead and provide a growing number of customers in the driest part of the UK with a reliable water supply, whatever happens.
Increased computing power will certainly allow more detailed predictions, focusing on particular areas and specific periods in the future in a way that is not possible at present, but until climate systems are fully understood this does not mean that the information will be any more accurate. There is an old adage in computing: GIGO (garbage in, garbage out). If the underlying science used in the programs is wrong, the computing processes it is subjected to will not correct it, and the output will be unreliable.
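The GIGO point can be shown with a deliberately crude sketch. Everything here is hypothetical, invented purely for illustration: a linear "projection" is run with a mistaken input assumption, once coarsely and once with a thousand times more computational steps standing in for a thousand times more computing power. The finer run is more precise, but no more correct.

```python
# Toy illustration of GIGO: more computing power refines a flawed
# calculation without making it any more correct. All names and
# numbers are invented for illustration.

def project_warming(sensitivity, co2_growth, years, steps_per_year=1):
    """Crude linear projection: warming accumulates as
    sensitivity * co2_growth per year, integrated in small steps.
    steps_per_year stands in for available computing power."""
    dt = 1.0 / steps_per_year
    warming = 0.0
    for _ in range(years * steps_per_year):
        warming += sensitivity * co2_growth * dt
    return warming

wrong_sensitivity = 1.0   # a mistaken assumption fed into the model
# (pretend reality corresponds to a sensitivity of 0.5)

coarse = project_warming(wrong_sensitivity, 0.02, 50, steps_per_year=1)
fine = project_warming(wrong_sensitivity, 0.02, 50, steps_per_year=1000)
# coarse and fine agree to many decimal places -- both give the same
# wrong answer, the fine run merely computes it at greater expense.
```

The extra thousand-fold effort buys agreement between the runs, not agreement with reality; only fixing the faulty input assumption would do that.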
But it would seem that the main theme of this conference really is the need for vastly more powerful hardware on which to run climate predictions. Here the delegates have been fortunate. A quick look at the list of sponsors who have helped make their get-together possible reveals that IBM, Cray and NEC have all, no doubt quite selflessly, lent a hand. In fact IBM and NEC are so enthusiastic about the ‘blueprint to launch a revolution in climate prediction’ that they are providing all the delegates with banquets (their word, not mine) on two consecutive evenings. As the world’s leading suppliers of supercomputers, who can blame them?
Of course developing a new generation of computer hardware will be very, very expensive, but if climate scientists say that they must have it in order to save the planet, then perhaps governments will be persuaded to channel funds into the project, and everyone will be happy: the computer giants, because this will open up a new market for them; the climate-dependent businesses, because the modelling revolution will provide them with high-definition predictions to use in their business plans; the climate scientists, who will be able to say that, whatever shortcomings models may have now, a breakthrough is just around the corner if only they have the tools to do the job; and the IPCC and other advocates of global warming, who will have even more detailed scenarios with which to confront sceptical politicians and the public.
The only question is, will we have any more reason to trust the modellers’ predictions, or will they just be providing more elaborate versions of the wrong answer?
Note: In a day or two I will be posting on some other aspects of this conference.
Update, 12/05/2008: I mentioned this post in a comment at Steve McIntyre’s Climate Audit blog on a thread entitled, Koutsoyiannis 2008 Presentation. This deals with the verification and falsification of model predictions at a highly technical level, but the last paragraph of a comment from Prof. Koutsoyiannis (here) on that thread seems to be particularly relevant to the conclusions reached above.