The worst-case scenario for climate change just got incomprehensibly worse. Stratocumulus clouds break up if carbon dioxide levels get high enough. Since these clouds cool the planet by reflecting light back to space, interfering with them would amplify global warming beyond anything seriously considered up to this point. Although it is unlikely humans will release enough carbon dioxide to trigger this catastrophe, we're heading for something closer than anyone should feel comfortable about.
Stratocumuli are usually non-rain-bearing clouds widespread in the tropics and subtropics, covering almost 20 percent of the oceans there at any one time. They're powerful reflectors, so if the portion of the globe they cover were to fall significantly, global temperatures would leap.
Indeed, if we lost the world's stratocumulus clouds entirely the planet as a whole would warm by 8°C (14°F), Professor Tapio Schneider of the California Institute of Technology reports in Nature Geoscience. Temperatures in subtropical regions would rise by an average of 10°C (18°F), but it's hard to imagine any human civilization left to record it precisely. This rise, by the way, is on top of, rather than including, the temperature increases already predicted by climate models.
Schneider has modeled the effect of higher carbon dioxide levels on stratocumulus clouds. He found that above atmospheric concentrations of 1,200 parts per million (ppm), instability appears within the clouds and the atmosphere above becomes more opaque to long-wavelength radiation. Together, these effects break the existing enormous banks of stratocumulus clouds into scattered cumulus puffballs that reflect less than 10 percent as much light.
Moreover, if such a thing were to occur, it would be almost impossible to reverse. Reforming stratocumulus banks requires carbon dioxide levels below today's levels. At intermediate CO2 concentrations, the existing situation is maintained, whatever it may be.
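The irreversibility described here is a classic hysteresis: between the two thresholds, whichever state the system is already in persists. A minimal sketch of that logic, using the 1,200 ppm breakup threshold from the article and an illustrative reformation threshold of 400 ppm (the article says only "below today's levels", so the exact figure here is an assumption):

```python
# Toy hysteresis model of the stratocumulus transition described above.
# BREAKUP_PPM comes from the article; REFORM_PPM is an illustrative
# assumption standing in for "below today's levels".
BREAKUP_PPM = 1200
REFORM_PPM = 400

def step(state, co2_ppm):
    """Return the cloud state given the current CO2 concentration.

    state is "stratocumulus" (intact decks) or "cumulus" (broken up).
    Between the two thresholds the existing state simply persists,
    which is what makes the transition effectively irreversible.
    """
    if co2_ppm > BREAKUP_PPM:
        return "cumulus"
    if co2_ppm < REFORM_PPM:
        return "stratocumulus"
    return state  # bistable region: no change

# Drive CO2 up past the threshold and back down to today's level:
state = "stratocumulus"
states = []
for co2 in [410, 800, 1300, 900, 410]:
    state = step(state, co2)
    states.append(state)
# The decks break up at 1,300 ppm and remain broken even back at 410 ppm.
```

Returning CO2 to today's 410 ppm does not restore the clouds in this sketch, because 410 ppm sits inside the bistable region where the broken state persists.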
According to Schneider, while the consequences are dramatic, the effect is only detectable when atmospheric models operate at very fine scales, which is why previous global climate models have missed it. This may explain why certain past eras, such as the early Eocene 50 million years ago, were surprisingly hot, given what else we know about conditions at the time.
Before you sell all your worldly wealth and join an apocalyptic death cult, it's important to note that we're a long way from 1,200 ppm, and will probably never get there. Before the Industrial Revolution, CO2 levels were around 270 ppm. They're now at 410 ppm. The Intergovernmental Panel on Climate Change projects levels of 600-1,000 ppm by 2100 if we refuse to act on climate change, or below 400 ppm if we get serious.
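A back-of-envelope calculation shows why 1,200 ppm is distant on any plausible trajectory. Assuming, purely for illustration, a constant growth rate of about 2.5 ppm per year (roughly the recently observed rate; the actual rate varies and has been slowly accelerating):

```python
# How long until 1,200 ppm at a constant growth rate?
# The 2.5 ppm/year figure is an assumption for illustration only.
current_ppm = 410
threshold_ppm = 1200
growth_per_year = 2.5

years_to_threshold = (threshold_ppm - current_ppm) / growth_per_year
# 790 ppm to go / 2.5 ppm per year = 316 years at a constant rate.
```

Even tripling that growth rate would leave the threshold roughly a century away, which is why the article treats the scenario as unlikely rather than imminent.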
On the other hand, one has to consider the possibility that Schneider's estimates are actually too optimistic, and stratocumulus collapse could occur at a somewhat lower level. Is that a risk we really want to take?