The textbook understanding of global chemical weathering—in which rocks dissolve, wash down rivers, and eventually end up on the ocean floor to begin the process again—does not depend on Earth’s temperature in the way that geologists had believed, a new study suggests.
The study, published in Nature Communications, looks at a key aspect of carbon cycling, the process by which carbon atoms move among the air, rocks, and the oceans. The results call into question the role of rocks in setting our planet’s temperature over long timescales.
“Understanding how the Earth transitioned from a hothouse climate in the age of the dinosaurs to today could help us better understand long-term consequences of future climate change,” says corresponding author Joshua Krissansen-Totton, a doctoral student in earth and space sciences at the University of Washington.
The textbook theory
The current understanding is that Earth’s climate is controlled over periods of millions of years by a natural thermostat related to the weathering of rocks.
Volcanoes release carbon dioxide into the air, and this gas may then dissolve into rainwater and react with silicon-rich continental rocks, causing chemical weathering of the rocks. This dissolved carbon then flows down rivers into the ocean, where it ultimately gets locked up in carbon-containing limestone on the seafloor.
As a potent greenhouse gas, atmospheric carbon dioxide also traps heat from the sun. And a warmer Earth increases the rate of chemical weathering both by causing more rainfall and by speeding up the chemical reactions between rainwater and rock. Over time, reducing the amount of carbon dioxide in the air by this method cools the planet, eventually returning the climate to more moderate temperatures—or so goes the textbook picture.
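To make that feedback loop concrete, here is a minimal sketch of how such a thermostat is often modeled: CO2 sets temperature through logarithmic greenhouse forcing, temperature sets the strength of the weathering sink, and the sink relaxes CO2 back toward balance with volcanic outgassing. All parameter values below are illustrative assumptions, not numbers from the study.

```python
# Minimal sketch of the silicate-weathering "thermostat" feedback.
# Parameter values are illustrative assumptions, not the study's fit.
import math

S = 3.0              # climate sensitivity: deg C per CO2 doubling (assumed)
T_E = 10.0           # e-folding temperature of weathering, deg C (assumed)
V = 1.0              # volcanic CO2 source, normalized to baseline weathering
C0, T0 = 1.0, 15.0   # reference CO2 (normalized) and temperature (deg C)

def temperature(c):
    """Global mean temperature from CO2 via logarithmic greenhouse forcing."""
    return T0 + S * math.log2(c / C0)

def weathering(t):
    """CO2 drawdown by weathering, growing exponentially with temperature."""
    return math.exp((t - T0) / T_E)

# Start with CO2 doubled (e.g., after a burst of volcanism) and let
# the feedback pull it back toward balance with the volcanic source.
c, dt = 2.0, 0.05
for _ in range(2000):
    c += dt * (V - weathering(temperature(c)))
print(f"CO2 settles near {c:.2f} x baseline, T near {temperature(c):.1f} C")
```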
“The general idea has been that if more carbon dioxide is released, the rate of weathering increases, and carbon dioxide levels and temperature are moderated,” says coauthor David Catling, a professor of earth and space sciences. “It’s a sort of long-term thermostat that protects the Earth from getting too warm or too cold.”
A weaker link
The new study began when researchers set out to determine conditions during the earliest life on Earth, some 3.5 billion to 4 billion years ago. They first tested their ideas on what they believed to be a fairly well-understood time period: the past 100 million years, when rock and fossil records of temperatures, carbon dioxide levels, and other environmental variables exist.
Earth’s climate 100 million years ago was very different from today. During the mid-Cretaceous, the poles were 20 to 40 degrees Celsius warmer than the present. Carbon dioxide in the air was more than double today’s concentrations. Seas were 100 meters (330 feet) higher, and dinosaurs roamed near the ice-free poles.
The researchers created a computer simulation of the flows of carbon required to match all the geologic records, thus reproducing the dramatic transition from the warm mid-Cretaceous times to today.
“We found that to be able to explain all the data—temperature, CO2, ocean chemistry, everything—the dependence of chemical weathering on temperature has to be a lot weaker than was commonly assumed,” Krissansen-Totton says. “You also need to have something else changing weathering rates that has nothing to do with temperature.”
Geologists had previously estimated that a temperature increase of 7 degrees Celsius would double the rate of chemical weathering. But the new results show that more than three times that jump, about 24 degrees Celsius, is required to double the rate at which rock weathers away.
“It’s just a much less efficient thermostat,” Krissansen-Totton says.
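To see what that difference implies, the short calculation below compares how much weathering speeds up for a given warming under the two parameterizations, assuming the rate grows exponentially with temperature (doubling every 7 versus every 24 degrees Celsius). This is back-of-envelope arithmetic, not the study's model output.

```python
# Compare the two weathering parameterizations quoted above: doubling
# per 7 C of warming (older estimate) versus per 24 C (this study).

def weathering_speedup(warming_c, doubling_temp_c):
    """Factor by which weathering accelerates after `warming_c` degrees,
    if the rate doubles every `doubling_temp_c` degrees."""
    return 2 ** (warming_c / doubling_temp_c)

for d_t in (2, 5, 10):
    print(f"+{d_t:>2} C: weathering x{weathering_speedup(d_t, 7):.2f} (old) "
          f"vs x{weathering_speedup(d_t, 24):.2f} (new)")
```

With the weaker dependence, even 10 degrees of warming boosts weathering by only about a third, rather than nearly tripling it, which is why the thermostat does so much less to rein in CO2.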
Steeper surfaces
The authors suggest that another control on the rate of weathering may be how much land is exposed above sea level and how steep Earth’s surface is. When the Tibetan Plateau formed some 50 million years ago, the steeper terrain may have increased the global rate of chemical weathering, drawing down more CO2 and cooling the climate to today’s more moderate temperatures.
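One way to picture this is a weathering law with a temperature-independent “weatherability” factor. In the sketch below (illustrative parameters, not the paper’s fit), weathering balances volcanism at equilibrium; because the temperature dependence is weak, even a small rise in weatherability, such as from uplift of steep, fresh rock, forces a large drop in equilibrium CO2.

```python
# Sketch of a temperature-independent "weatherability" knob: uplift of
# steep, fresh terrain multiplies the weathering sink regardless of
# climate. Parameters are illustrative assumptions, not the paper's fit.
import math

S = 3.0                    # deg C per CO2 doubling (assumed)
T_E = 24.0 / math.log(2)   # weak dependence: weathering doubles per 24 C

def equilibrium_co2(weatherability, volcanism=1.0):
    """CO2 (relative to modern) at which weathering balances volcanism:
    weatherability * exp(d_t / T_E) = volcanism, with d_t = S * log2(CO2)."""
    d_t = T_E * math.log(volcanism / weatherability)
    return 2 ** (d_t / S)

print(f"baseline weatherability: CO2 = {equilibrium_co2(1.00):.2f} x modern")
print(f"weatherability +5%:      CO2 = {equilibrium_co2(1.05):.2f} x modern")
```

In this toy setup a 5 percent rise in weatherability cuts equilibrium CO2 by roughly a third, illustrating how uplift, rather than temperature alone, could drive the large drawdown since the Cretaceous.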
“In retrospect, our results make a lot of sense,” Catling says. “Rocks tell us that Earth has had large swings in temperature over geological history, so Earth’s natural thermostat can’t be a very tight one.”
Their calculations also indicate a stronger relationship between atmospheric CO2 and temperature, known as climate sensitivity. Over these long timescales, doubling CO2 in the atmosphere eventually produced a rise of 5 to 6 degrees Celsius in global temperature, about twice the typical projections for the warming expected over coming centuries from a similar human-caused doubling of CO2.
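As a rough illustration, equilibrium warming under logarithmic CO2 forcing can be written as delta_T = S × log2(C/C0). The snippet below plugs in a sensitivity of 5.5 degrees Celsius per doubling to approximate the study’s long-term figure, alongside an assumed typical century-scale value of about 3 degrees for comparison.

```python
# Rough illustration of warming under logarithmic CO2 forcing:
# delta_T = S * log2(C / C0). S = 5.5 C approximates the study's
# long-term sensitivity; S = 3.0 C is an assumed, typical
# century-scale value used here only for comparison.
import math

def warming(co2_ratio, sensitivity):
    """Equilibrium warming (deg C) for a given CO2 ratio C/C0."""
    return sensitivity * math.log2(co2_ratio)

for ratio in (1.5, 2.0):
    print(f"CO2 x{ratio}: long-term {warming(ratio, 5.5):.1f} C "
          f"vs century-scale {warming(ratio, 3.0):.1f} C")
```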
Though not the final word, researchers say, these numbers are bad news for today’s climate shifts.
“What all this means is that in the very long term, our distant descendants can expect more warming for far longer if carbon dioxide levels and temperatures continue to rise,” Catling says.
The researchers will now apply their calculations to other periods of the geologic past.
“This is going to have implications for the carbon cycles for other times in Earth’s history and into its future, and potentially for other rocky planets beyond the solar system,” Krissansen-Totton says.
NASA funded the work.
Source: University of Washington