Recent research by climate scientists has found that the world's carbon budget is about 15% smaller than previously estimated. That means that to meet the widely agreed target of keeping the rise in global average temperatures below 2°C above pre-industrial levels by the end of the century, cumulative carbon dioxide emissions will have to be cut by 15% more than was first estimated.
The "carbon budget" is the total amount of carbon dioxide and other greenhouse gases that can still be released into the atmosphere while keeping the rise in temperatures within the stipulated two-degree mark.
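The arithmetic of the cut is straightforward. A minimal sketch, using an invented budget figure purely for illustration (the article does not give one):

```python
# Hypothetical illustration of trimming a carbon budget by 15%.
# The budget figure below is invented for this example; the study's
# actual budget numbers are not quoted in the article.
old_budget_gt = 1000.0                # hypothetical remaining budget, Gt CO2
new_budget_gt = old_budget_gt * 0.85  # budget shrinks by 15%
print(new_budget_gt)                  # 850.0
```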
The 15% figure comes from a new study by Patrick Brown and Ken Caldeira of the Carnegie Institution for Science in Stanford, California. Brown and Caldeira examined high-powered climate simulations and found that the ones that best reproduce current climate observations predicted higher levels of warming than has been generally accepted, The Washington Post (WP) reported.
Climate scientists normally take a number of predictive models and combine them. The results from each model together provide a range within which the planet is expected to warm for a given level of CO2. This is how the United Nations' Intergovernmental Panel on Climate Change conducts its climate assessments, the report notes.
Brown and Caldeira, however, adopted a novel approach. Rather than treating dozens of models as equally credible, they used satellite observations of the atmosphere to compare how accurately each model reproduces what is actually measured.
The data they used emphasise the balance of incoming and outgoing radiation, which is what actually determines how much warmer the Earth will get over time. Researchers call this the planet's "energy imbalance".
"We know enough about the climate system that it doesn't necessarily make sense to throw all the models in a pool and say, we're blind to which models might be good and which might be bad," said Brown.
The best models, according to the researchers, are the ones that most accurately capture Earth's energy imbalance. These also turn out to be the ones that predict more warming, and faster, than expected.
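The weighting idea described above can be sketched in a few lines. Everything here is hypothetical: the model names, projections, simulated imbalances, and the "observed" value are invented for illustration, and the inverse-error weighting is one simple choice, not the study's actual statistical method:

```python
# Sketch of observation-weighted model averaging (all values hypothetical).
projections = {          # each model's projected warming, degrees C
    "model_a": 4.1,
    "model_b": 4.6,
    "model_c": 5.0,
}
simulated_imbalance = {  # each model's simulated energy imbalance, W/m^2
    "model_a": 1.4,
    "model_b": 0.8,
    "model_c": 0.7,
}
observed_imbalance = 0.9  # hypothetical satellite-derived value, W/m^2

# Weight each model by how closely it reproduces the observed imbalance:
# the smaller the error, the larger the weight.
weights = {m: 1.0 / (abs(sim - observed_imbalance) + 1e-6)
           for m, sim in simulated_imbalance.items()}
total = sum(weights.values())

# Traditional unweighted average versus observation-weighted average.
unweighted = sum(projections.values()) / len(projections)
weighted = sum(projections[m] * w for m, w in weights.items()) / total

print(f"unweighted: {unweighted:.2f} C, weighted: {weighted:.2f} C")
```

With these invented numbers the model closest to the observation dominates the average, pulling the weighted projection above the unweighted one, which mirrors the article's point that the most accurate models predict more warming.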
Current forecasts using the UN method give an average warming of 4.3°C, with a margin of error of plus or minus 0.7°C, for the period from 2081 to 2100.
The best-models approach used in this study predicts 4.8°C, with a margin of plus or minus 0.4°C, for the same period. While that might not seem like much of a difference, the researchers point out that it amounts to roughly a 15% change.
Michael Winton of the Geophysical Fluid Dynamics Laboratory of the National Oceanic and Atmospheric Administration (NOAA), who was not involved in the study, told WP that the difference between the traditional method's results and the new findings comes down largely to clouds.
Known as "parameterisation", this is the practice of representing small-scale processes statistically because they occur at scales too fine for large, comprehensive models to resolve directly, and such approximations are a recognised source of error.
Clouds, for example, play an important role in reflecting radiation away from the Earth because of their bright surfaces. If clouds change, notes the report, so will the climate. "So what you're looking at is, the behaviour of what I would say is the weak link in the model," Winton said.
Brown and Caldeira's study addressed this weak link by favouring the models that best represent the effects of clouds.
The report notes that the world is already considered to be off the mark when it comes to controlling carbon emissions. With this study suggesting the carbon budget must be cut by a further 15%, the challenge facing policymakers is enormous.