In a chicken-v.-the-egg-style carbon quandary, scientists have been debating the carbon costs of managing forest fires for some time. It’s complicated: Forest fires belch carbon into the atmosphere, but the new growth that springs up behind fires captures carbon. Is the new growth capturing enough of the fire-emitted carbon? Should we let the fires burn because more trees equal more carbon absorption? Does Al Gore have a presentation that makes this comprehensible?
In a new study that compared 1930s surveys of California’s forests with 1990s surveys of the same areas, researchers at UC-Irvine found that while the overall number of trees in these forests increased, the total vegetation (biomass, in carbon-sink parlance) decreased 26 percent over those 60 years.
The reason? Fire-suppression efforts have increased the number of small trees but weakened large trees, which are huge carbon absorbers. One of the study’s authors put it best: for every large tree a forest loses, 50 small trees are required to absorb the same amount of carbon.
The kicker: These forests now store only one-third the amount of carbon they did in the ’30s and actually release carbon, too.
Should we let them burn? Or wait for Burning Man to move to California?
In other burning news, a Navy F/A-18 Super Hornet dropped a 500-pound bomb a mile off course and started a fire in Florida’s Ocala National Forest. The ensuing blaze burned 257 acres and raised the jet’s carbon footprint to an amount rivaling the Olympic Torch’s.
— Jenn Fields