RealClimate has an interesting post up today explaining how climatologists can say with some certainty that the observed increase in carbon dioxide concentration in the atmosphere comes from human activity. The IPCC report goes into detail about many of the lines of reasoning, but RealClimate adds another scientific argument. The article is a bit technical -- enough so that the author has noted that it needs a rewrite -- but makes sense if you follow the reasoning. Let me break it down:
Carbon atoms come in three different isotopes (types based on the different numbers of neutrons in the nucleus): carbon-12 (referred to by chemists as 12C); carbon-13 (13C); and carbon-14 (14C), best known for its use in archeological dating. The proportion of these three types is well-studied, in large part because of the radiocarbon dating work. Historically, carbon-12 makes up the vast majority of carbon atoms (about 98.9%), carbon-13 makes up just 1.11%, and carbon-14 atoms are just 1 in 1 trillion among the carbon atoms out there.
An important fact to keep in mind: in fossil fuels, there are fewer carbon-13 atoms relative to carbon-12 atoms than in the atmosphere. This is because carbon-13 weighs just a tiny amount (one neutron's worth) more than carbon-12, and over time, physical and biological processes can filter the isotopes apart. Photosynthesis, in particular, slightly prefers the lighter 12C, so plant material -- and the fossil fuels formed from ancient plants -- ends up depleted in 13C relative to the atmosphere.
Research attempting to improve the accuracy of radiocarbon dating has produced a detailed record of variations in the proportionate levels of carbon over the last 10,000 years. At no point in the last 10,000 years has the relative proportion of 13C in the atmosphere been as low as it is now. Furthermore, the ratio of 13C to 12C starts to decrease (as measured in tree ring data, ice core data, and coral data) at the exact same time that the concentration of atmospheric carbon dioxide starts to rise, around 1850. The total change in proportion is about 0.15%, a seemingly small number, but one which is huge in terms of isotope variation in nature. The last glacial-to-interglacial change in the ice core records, which took many thousands of years, saw only a 0.03% change. Labs can measure variations in 13C to 12C as low as 0.005%.
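For those curious how isotope labs express such tiny ratio shifts, here is a minimal sketch of the standard delta-13C (per mil) notation. The standard ratio value below is an approximation of the VPDB reference commonly used; the function name is my own:

```python
# Sketch of the delta-13C "per mil" notation isotope labs use for tiny ratio shifts.
# delta-13C = (R_sample / R_standard - 1) * 1000, where R is the 13C/12C ratio.
R_STANDARD = 0.011180  # approximate 13C/12C of the VPDB reference standard

def delta13c_permil(r_sample: float) -> float:
    """Express a 13C/12C ratio as a per-mil deviation from the standard."""
    return (r_sample / R_STANDARD - 1) * 1000.0

# The ~0.15% relative drop quoted above corresponds to a shift of about
# 1.5 per mil -- far larger than the ~0.005% (0.05 per mil) labs can resolve.
print(delta13c_permil(R_STANDARD * (1 - 0.0015)))
```

In this notation, a small relative change in the raw ratio becomes a comfortably measurable number of "per mil" units, which is why the 0.15% drop stands out so clearly against natural variability.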
In short, then: the atmospheric 13C/12C ratio has fallen in lockstep with rising CO2 since around 1850, and fossil fuel carbon is exactly the kind of 13C-depleted carbon that would produce such a drop. The most reasonable explanation is therefore that the increase in atmospheric carbon came primarily from the increased use of fossil fuels.
This is a nice post.
But a somewhat simpler argument also demonstrates that the rising CO2 concentrations are due to human activities: fossil fuel carbon is basically devoid of 14C. 14C, or "radiocarbon", decays radioactively (with a half-life of about 5,700 years) and is essentially absent in 200-300-million-year-old oil and coal.
And guess what? Measurements of carbon isotopes in the atmosphere show that a big share of the rising CO2 is devoid of 14C. So these emissions are from *old* carbon. Really, really old. Can anyone say *fossil* fuels?
Another nail in the coffin.
But remember that not all of the rising CO2 is due to burning fossil fuels. A lot of the emissions (about 1-2 billion tons of carbon per year) are also due to land-use practices, such as deforestation and land degradation. Compared to the ~6 billion tons of carbon burned in fossil fuels each year, that is still relatively small. But land use used to be a bigger part of the carbon emissions into the atmosphere, and was actually larger than the fossil fuel emissions until the 1950s.
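To put those two figures side by side, here is a rough share calculation using the numbers quoted above (the 1.5 value is simply the midpoint of the 1-2 billion ton estimate):

```python
# Rough present-day carbon-budget shares, in billions of tons of carbon per year,
# using the approximate figures quoted in the text.
land_use = 1.5  # midpoint of the 1-2 GtC/yr land-use estimate
fossil = 6.0    # approximate fossil fuel emissions

total = land_use + fossil
print(f"land use: {land_use / total:.0%}, fossil fuels: {fossil / total:.0%}")
# Land use works out to roughly a fifth of the total -- significant, but secondary.
```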
The 13C argument is reasonable too, but I find the 14C argument to be convincing as well.
Thank you, Jon.
I think the argument for the 13C approach is that since its proportion in the atmosphere is greater, the measured difference is more certain. But you're right -- the 14C argument works well, too!
Yes -- Good point. 13C is much more abundant than 14C, although it's now possible to measure both very accurately with accelerator mass spectrometers.