Higher temperatures over 34 years — rather than land-use changes — have led to more blazes, researchers say. They’re sure it’s not a fluke.
By Robert Lee Hotz
Rising temperatures throughout the West have stoked an increase in large wildfires over the past 34 years as spring comes earlier, mountain snows melt sooner and forests dry to tinder, scientists reported Thursday.
More than land-use changes or forest management practices, the changing climate was the most important factor driving a four-fold increase in the average number of large wildfires in the Western United States since 1970, the researchers concluded.
The average spring and summer temperatures were more than 1.5 degrees higher in Western states between 1987 and 2003 than during the previous 17 years. In fact, the seasonal temperatures were the warmest since record-keeping started in 1895, the researchers said.
While the researchers stopped short of linking increased wildfire intensity to global warming caused by rising levels of greenhouse gases, they were confident that they had documented a broad climate trend and not a fluke of natural weather variability.
“It all fits together,” said climate researcher Anthony Westerling, who led the research while at the Scripps Institution of Oceanography in La Jolla. “The [fire] seasons do start earlier and run longer. It is consistent with a changing climate.”
Some scientists were more confident that greenhouse gases from industrial activity, cars and pollution were to blame.
“I think this is the equivalent for the West of what hurricanes are for the Gulf Coast,” said fire ecologist Steven Running at the University of Montana in Missoula, who was not connected with the research. “This is an illustration of a natural disaster that is accelerating in intensity as a result, I feel, of global warming.”
All told, the average fire season has grown more than two months longer, while fires have become more frequent, longer-burning and harder to extinguish. They destroy 6.5 times more land than in the 1970s, the researchers found.
Last year was the worst wildfire season on record, with over 8.53 million acres burned nationwide by the end of December. So far this year, more than 60,000 wildfires, twice the number during the same period last year, have charred almost 3.9 million acres, according to the National Interagency Fire Center in Boise, Idaho.
“I see this as one of the first big indicators of climate change impacts in the continental United States,” said Thomas W. Swetnam, an expert on fire history and director of the Laboratory of Tree-Ring Research at the University of Arizona in Tucson, who was part of the research team.
In the first detailed study of its kind, scientists at Scripps and the University of Arizona analyzed 34 years of wildfire activity, temperature records, snow-melt trends, stream flows and other climate-related data.
The research, published online Thursday by the journal Science, was funded by grants from the National Oceanic and Atmospheric Administration, the U.S. Forest Service and the California Energy Commission.
The researchers studied more than 1,100 large wildfires between 1970 and 2003.
They reported that almost seven times more forested federal land burned between 1987 and 2003 than during the previous 17 years. During the same period, the length of the wildfire season increased by 78 days. The average time between a fire’s discovery and its extinguishment also lengthened from 7.5 days to 37.1 days.
Wildfires cost more than $1 billion a year in federal firefighting expenses, plus untold property damage.
The impact of rising temperatures on wildfires seemed most profound in the forests of the northern Rockies, which accounted for 60% of the blazes from 1987 to 2003, and least pronounced in arid Southern California, the researchers said.
The effort to prevent Western wildfires has been consumed in recent years by controversies over management of federal forests. Forestry managers have attempted to rectify generations of fire suppression, which allowed combustible vegetation to build up to dangerous levels, by thinning forests and logging weak trees.
“The most stark conclusion here is that, while they do say land use and management has played an important role, the broad scale increase in wildfire frequency has been driven primarily by recent changes in climate,” said wildland fire analyst Tom Wordell at the National Interagency Fire Center. “It does not paint a pretty picture for future fire activity, given the climate model predictions.”
Indeed, the finding calls into question the ability of forest management policies to keep pace with what may be an insoluble problem. The Government Accountability Office recently reported that the cost and severity of fires have grown as more people build homes on the edges of national forests and other federal lands.
“It is looking like we will have more wildfires and, given what we can afford, management is not going to prevent an increase in wildfire area,” said research ecologist Don McKenzie at the Department of Agriculture’s Pacific Wildland Fire Sciences Lab in Seattle, who was not connected with the study. “We don’t have the resources to do that.”
If regional temperatures continue to rise, as many computer climate models predict, wildfire activity throughout the West will intensify so long as there is acreage to burn, several experts said.
Moreover, as more forests burn, the destruction of so much biomass will release massive amounts of carbon dioxide, accelerating the buildup of greenhouse gases in the atmosphere and pushing temperatures higher still.
Western forests account for as much as 40% of all the carbon sequestered in the U.S.
“Lots of people think climate change and the ecological responses are 50 to 100 years away,” said research team member Swetnam.
“But it’s not 50 to 100 years away. It’s happening now in forests through fire.”