Reading about the wildfires burning in northern Minnesota got me curious about how the firefighting year had gone for the U.S. Forest Service nationally.
As I wrote here a couple of months ago, the service changed course this year and quietly abandoned its relatively recent, long overdue and entirely sensible policy of fighting only those wildland fires that threaten human lives, homes or businesses, or seem likely to grow to uncontrollable size.
Instead, the service went back to putting out pretty much every fire, and as soon as possible. The reason given for this retreat from modern thinking was strictly financial: With firefighting costs predicted to exceed the 2012 budget, top managers gambled that they could make the money go farther by fighting every blaze at the outset rather than risk a few growing into budget-busting proportions.
And did that bet pay off? Not so much, it would appear.
The year’s allocation of something over $500 million for wages, equipment and a contract air force ran out around the end of August, according to a piece in Sunday’s Washington Post, and the 8 million acres that had burned by that point already made this the worst fire year on record, fully 30 percent above an average year, consuming territory roughly equal in size to the state of Maryland.
And it ain’t over yet. Used to be the wildfire season started sometime in June and faded away in September; now it starts in May and runs through October. A 5-million-acre year used to be rare, according to the Post; now it’s becoming almost ordinary.
The good news, from the Forest Service’s point of view, is that Congress gave it another $400 million in the continuing resolution approved at the end of last month. Whether that’s good news for the taxpayers, or for the public lands that belong to them, is a matter of opinion, I guess.
And where we go from here is anybody’s guess.
Cutting prevention to pay for suppression
The obvious problem of budgeting for wildland firefighting is that the costs can vary dramatically from year to year, as a result of factors that can’t be predicted with much reliability. Like drought. And lightning.
In the good old days, I’ve been told, Forest Service managers followed an unwritten 10 a.m. rule and spent pretty much whatever it took to knock down all fires by midmorning of the day after spotters called them in. If they had money left over come September, it carried over to the next year; if there was a deficit, Congress wrote a check.
More recently, as the Post piece explains, the Forest Service gets an annual appropriation based on a 10-year rolling average of its firefighting costs. And while that might make sense if fire patterns were stable, it has proved inadequate at a time when climate change is causing longer periods of dryness and drought, giving fires more fuel to burn and resulting in longer wildfire seasons.
From 2002 through 2008, according to the Post, the Forest Service covered shortfalls in the firefighting account by transferring funds from other programs, to the tune of $2.2 billion over that seven-year period; Congress restored only a fraction of that amount.
Perversely, the programs that were raided to pay for more firefighting were the very programs aimed at reducing the risk of big, expensive wildfires through such preventive measures as brush reduction and selective thinning of overgrown forests.
Perhaps it’s only coincidence that this period falls within the presidency of George W. Bush, whose administration emphasized timber production over all other priorities for the Forest Service.
As of 2010, the Post says, the service was given a new revolving account called FLAME, intended to bank excess money in years of lower firefighting costs and release it in years of greater need:
Congress allocated $415 million for FLAME’s first fiscal year, 2010, which turned out to be a mild fire season. As luck would have it, the following season also presented fewer fires, and a small budget surplus went into FLAME.
But in 2011, Congress went right in after it, taking at least $200 million from the fund and placing it into the general treasury to use for other expenditures.
Climate change and fire patterns
Fires that make headlines in Minnesota are usually of a size that might be taken as trivial in the West. But there are exceptions, and we are not immune to the climate trends that are worsening wildfire scenarios everywhere, even if we have the occasional good fortune of a smothering snowfall in October.
A good MPR story a few weeks ago tied last year’s gigantic, $22 million Pagami Creek fire at the edge of the Boundary Waters to these changes:
Pagami Creek is just one example of the Forest Service struggling to respond to a new era of larger, more destructive forest fires. Nationally the six biggest fire seasons since 1960 have all occurred in the past nine years.
The likelihood for such fires is growing, said Tim Sexton, director of the Forest Service’s Wildland Fire Decision Support Center and research development applications program.
“There are a number of forests that are expected to have even more severe fire seasons this coming decade because of some of the climate change factors that are coming into play,” he said. “The Superior is one of those forests that likely will experience larger, more devastating fires in the coming decade.”
For a deeper look at the picture in the West, I commend an analysis by Climate Central of Forest Service data stretching back four decades. Some example findings:
On average, wildfires [now] burn twice as much land area each year as they did 40 years ago.
Compared to the average year in the 1970s, in the past decade there were seven times more fires greater than 10,000 acres each year, and nearly five times more fires larger than 25,000 acres each year.
The burn season is two and a half months longer than in the 1970s. Across the West, the first wildfires of the year are starting earlier and the last fires of the year are starting later, making typical fire years 75 days longer now than they were 40 years ago.