It follows earlier reports in the IPCC’s Sixth Assessment round, which reiterate that climate change is unequivocal and ubiquitous, that humans are to blame, and that warming will surpass the Paris target of keeping it below 2 degrees Celsius unless we make deep cuts to emissions.
For its projections of future warming, the IPCC relies heavily on climate models – computer simulations that help us understand how the climate has changed in the past and how it is likely to change in the future under various emissions scenarios.
These models are continuously updated but some new-generation models are “running hot”, showing a notably higher climate sensitivity than previous ones.
Estimating climate sensitivity
Climate sensitivity describes how much global temperatures will rise in response to human-caused greenhouse gas emissions. The best estimate is 3 degrees Celsius of warming for a doubling of pre-industrial carbon dioxide levels, with a likely range of 2.5 to 4 degrees Celsius, but ongoing research aims to narrow this range.
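To make the definition concrete: warming scales roughly with the logarithm of the CO2 concentration, so a sensitivity value can be applied to any CO2 level, not just a doubling. The sketch below uses this standard textbook relationship; the concentrations are illustrative, and the result is eventual equilibrium warming, ignoring the delay caused by ocean heat uptake.

```python
import math

def equilibrium_warming(co2_ppm, sensitivity_c=3.0, preindustrial_ppm=280.0):
    """Equilibrium warming (deg C) for a given CO2 concentration, using the
    standard logarithmic rule: each doubling of CO2 adds sensitivity_c degrees."""
    doublings = math.log2(co2_ppm / preindustrial_ppm)
    return sensitivity_c * doublings

# Eventual warming at a doubling (560 ppm) and near today's level (~420 ppm),
# across the IPCC's likely sensitivity range. Ocean lag is ignored.
for s in (2.5, 3.0, 4.0):
    print(f"sensitivity {s} C: {equilibrium_warming(560, s):.1f} C at 560 ppm, "
          f"{equilibrium_warming(420, s):.1f} C at 420 ppm")
```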
Several new models, contributed by renowned modelling centres, display climate sensitivities outside this likely range and larger than any models used for the IPCC’s last assessment in 2013. As a consequence, they simulate anomalously large and fast warming during the 21st century.
Critics see climate models generally as flawed attempts at capturing the complexities of the climate system, not good enough as scientific evidence to guide climate policies.
Yes, all climate models have flaws because they are models, not reality. But they are spectacularly successful at capturing past climate change, including the steady march of global warming and the intensification and increasing frequency of floods and droughts that now regularly make headlines. Nevertheless, the large sensitivities of some models are a cause for concern.
The story starts in the early 2000s, when various satellite measurements were combined to better describe the Earth’s radiation budget – the balance between incoming solar radiation and reflected outgoing visible light and invisible infrared radiation.
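As a rough illustration of what a radiation budget involves, the classic zero-dimensional energy balance equates absorbed sunlight with emitted infrared radiation. The numbers below are standard textbook values, not the satellite products described above.

```python
# Zero-dimensional energy balance: absorbed sunlight = emitted infrared.
# Textbook illustration only; real budgets come from satellite radiometry.
SOLAR_CONSTANT = 1361.0   # W/m^2, sunlight arriving at the top of the atmosphere
ALBEDO = 0.30             # fraction of sunlight reflected back to space
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

absorbed = SOLAR_CONSTANT / 4 * (1 - ALBEDO)   # averaged over the whole sphere
t_effective = (absorbed / SIGMA) ** 0.25       # emission temperature in kelvin

print(f"Absorbed solar radiation: {absorbed:.0f} W/m^2")
print(f"Effective emission temperature: {t_effective:.0f} K "
      f"(~{t_effective - 273.15:.0f} C)")
```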
Based on these measurements, the earlier IPCC report concluded clouds over the Southern Ocean were poorly represented in models, with insufficient sunlight reflected back into space and too much reaching the surface, where it warmed the ocean. Later research found many models simulated ice clouds when in fact they should have been liquid clouds.
Simulating water and clouds
This may sound like an elementary problem, but it isn’t. If water comes in very small droplets – as it does in clouds – it can remain liquid down to about -35 degrees Celsius. We call such droplets supercooled.
If the water contains impurities, its freezing temperature can be anywhere between 0 and -35 degrees Celsius. Simulating clouds under all conditions is therefore far from trivial.
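To give a flavour of how models handle this, many cloud schemes prescribe the liquid fraction of a cloud as a function of temperature between 0 and -35 degrees Celsius. The linear ramp below is a toy illustration of the idea, not any particular model’s scheme.

```python
def supercooled_liquid_fraction(temp_c, t_all_ice=-35.0, t_all_liquid=0.0):
    """Toy cloud-phase partition: fully liquid at 0 C, fully ice at -35 C,
    linear in between. Real schemes are more elaborate and account for
    impurities (ice-nucleating particles), but follow the same idea."""
    if temp_c >= t_all_liquid:
        return 1.0
    if temp_c <= t_all_ice:
        return 0.0
    return (temp_c - t_all_ice) / (t_all_liquid - t_all_ice)

for t in (5, -10, -25, -40):
    print(f"{t:+} C -> liquid fraction {supercooled_liquid_fraction(t):.2f}")
```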
Modelling groups generally succeeded in introducing more supercooled liquid clouds into their latest models and at least partly solved this Southern Ocean cloud problem. But this change weakened an important climate feedback: as the climate warms, liquid clouds become more prevalent at the expense of ice clouds.
Liquid clouds are brighter and more reflective than ice clouds, so under progressive global warming more and more incoming sunlight is reflected back into space, counteracting the warming effect. However, by replacing ice with supercooled liquid clouds, newer models weaken this cooling effect. This is the leading explanation for the larger climate sensitivity of many new-generation climate models.
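A back-of-the-envelope calculation shows why the phase mix matters: reflected sunlight scales with cloud albedo, and a cloud that is already liquid today has little ice left to convert as the climate warms, so this brake on warming is weaker. The albedo values below are assumed for illustration, not observed values.

```python
# Illustrative only: how reflected sunlight depends on a cloud's phase mix.
INCOMING = 340.0        # W/m^2, global-mean incoming sunlight (standard value)
ALBEDO_LIQUID = 0.6     # assumed reflectivity of a liquid cloud (illustrative)
ALBEDO_ICE = 0.4        # assumed reflectivity of an ice cloud (illustrative)

def reflected(liquid_fraction):
    """Sunlight reflected by a cloud with the given liquid/ice mix."""
    albedo = liquid_fraction * ALBEDO_LIQUID + (1 - liquid_fraction) * ALBEDO_ICE
    return INCOMING * albedo

# Warming that converts a mostly-ice cloud to liquid increases reflection
# (a brake on warming); a cloud that starts out liquid has no such brake left.
print(f"ice-dominated (20% liquid):   {reflected(0.2):.0f} W/m^2 reflected")
print(f"liquid-dominated (80% liquid): {reflected(0.8):.0f} W/m^2 reflected")
```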
The IPCC’s response
The latest IPCC report didn’t raise the estimate of the planet’s actual climate sensitivity. It cites observational evidence to make the case, including “historical” warming, which is very well understood for the past several decades.
Models with a middle-of-the-road climate sensitivity near 3 degrees Celsius often reproduce the temperature variations of this historical period better than those with a large climate sensitivity.
Further evidence comes from simulations of the Earth’s geological past (thousands to millions of years ago), which saw both much colder and much warmer climates than at present. Geological evidence shows high-sensitivity models exaggerate the temperature swings of this distant past. By the same token, a few very low-sensitivity models are also unlikely to be correct.
The latest report concludes climate sensitivity is now better understood, but it doesn’t go as far as dismissing high-sensitivity models altogether. Instead, it says such models simulate “high risk, low likelihood” futures that cannot be ruled out.
Refining climate models
What does the future hold for climate models? Climate sensitivity is the result of a model’s “tuning”, whereby parameters are varied systematically until the model produces an acceptable representation of the well-observed climate of the past few decades.
Clearly this process requires refinement. Low-, medium- and high-sensitivity models have all passed this test, yet they project quite different magnitudes of warming for this century.
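In spirit, tuning resembles the parameter sweep sketched below: run the model with candidate parameter settings and keep the setting that best matches observations. This is a hypothetical toy stand-in; real tuning involves vastly more expensive simulations and many parameters, and the function and relationship here are invented for illustration.

```python
# Hypothetical sketch of model tuning as a parameter sweep. Real tuning runs
# full climate simulations; here run_toy_model stands in for one.
import numpy as np

observed_warming = 1.1  # deg C, approximate warming since pre-industrial times

def run_toy_model(cloud_param):
    """Stand-in for a climate model run: maps one tunable cloud parameter
    to a simulated historical warming (invented relationship)."""
    return 0.6 + 1.5 * cloud_param

candidates = np.linspace(0.0, 1.0, 21)  # candidate parameter settings
errors = [abs(run_toy_model(p) - observed_warming) for p in candidates]
best = candidates[int(np.argmin(errors))]
print(f"best-fitting cloud parameter: {best:.2f} "
      f"(simulated {run_toy_model(best):.2f} C vs observed {observed_warming} C)")
```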
There is scope for increased cooperation between institutions, scientific disciplines and countries to rise to this challenge. The latest IPCC report did an excellent job of dealing with this spread, but better constraining the climate sensitivity of models would further raise confidence in climate projections.
The stakes are high. Climate projections inform expensive and disruptive adaptation and mitigation decisions around the world, including which coastal properties should be abandoned due to rising seas, how quickly we need to wean ourselves off fossil fuels, or how to make agriculture climate resilient and climate neutral while still feeding a growing human population.
Seen against this backdrop, a seemingly innocuous, technical problem in climate modelling takes on outsized importance.
(The Conversation. By Olaf Morgenstern, Principal Scientist — Atmosphere and Climate, National Institute of Water and Atmospheric Research)