I continuously see articles in the media about “good fire,” defined as frequent and low-severity. In other words, such fire seldom kills mature trees.
These fires supposedly mimic “historical” conditions, creating “healthy” ecosystems by clearing away fuels without killing mature trees. A “good fire,” by happy coincidence, also reduces high-severity blazes, or so we are told.
“Bad fires,” by contrast, are those that char extensive acreage and burn at high severity, killing mature trees. By inference, the narrative goes, such fires were not a historical component of forest ecosystems.
When I first attended college, the idea that any fire was “good” was an anomaly, and the scientists who advocated for fire on the landscape could be counted on one or two hands. Since then, most scientists, agency folks, and conservationists have adopted the idea that at least some fire, if not too hot, too expansive, or too “destructive,” is good. Hence the good fire-bad fire dichotomy.
A recent example of this perspective (and I could present dozens like it) is the following:
“Also known as “prescribed burns” or “cultural fires,” these intentionally started, controlled fires have received growing attention in recent years as a way to clear dry undergrowth that can build up and fuel huge wildfires. One 2024 review suggests that prescribed burning can reduce the severity of wildfires between 62% and 72% relative to untreated areas. However, the idea of prescribed burns is not new. Indigenous communities have long used “cultural fires” to remove old grass and support new growth. In addition, controlled fires may also help create a healthy ecosystem for cranes and other species by creating open habitats.”
Nearly everything in the above paragraph is false, exaggerated, or lacking context.
For instance, few wildfires actually encounter a prescribed burn while the treatment could still effectively influence fire spread and severity. Some estimates put wildfire encounters with treated areas at less than 1%. A prescribed burn can reduce fire severity only if a wildfire actually reaches the treated landscape.
The qualifier about when burns are effective is critical. Within a few years, sometimes as few as two, vegetation regrowth can negate any fuel reduction. In some cases, a prescribed fire can stimulate the growth of “fine fuels” like grasses, which are more flammable than the vegetation that occupied the site before the burn.
Most ecosystems were naturally characterized by long fire rotations, so burning frequently does not emulate any natural or historical fire condition and is entirely inappropriate for most western plant communities.
The plant communities with long fire rotations (the time between major fires) include mixed conifer forests, aspen, all fir forests, old-growth Douglas fir forests, sagebrush, chaparral, lodgepole pine, and numerous others. Some grasslands, like native bunchgrasses, cannot tolerate frequent fire and need up to 10 years to recover from a burn. Burn these grasses too frequently, and you encourage the growth of annuals like cheatgrass.
Prescribed burns also do not preclude “huge fires.” Such blazes, typically characterized as “bad” fires, are part of most ecosystems’ natural fire regimes, not some kind of aberration. Indeed, they are a critical part of “healthy” ecosystems, and precluding them degrades many ecosystems.
The prescribed burning or cultural burning narrative is largely a myth. The notion that human ignitions are necessary for healthy ecosystems has become entangled with the social movement that seeks to champion minority and disadvantaged groups such as Indian tribes.
Tribal people who were abused and disenfranchised in the past, and to a degree still are today, are now “celebrated” for their “ancient wisdom,” often termed “Traditional Ecological Knowledge.” This is part of the broader narrative suggesting that all of North America was under tribal management or was a humanized landscape. Some scientists dispute the entire premise, even for eastern forests.
Tribal people used fire to achieve several human-centered outcomes: favoring vegetation that would attract the big game they hunted or produce the foods they consumed, like berries, and clearing lands around their villages and travel corridors to make movement easier. These are all anthropocentric justifications for fire.
However, this is not a biocentric perspective, and it is a stretch to suggest that most burning was done to “improve” the ecosystem’s health. Indeed, in most ecosystems, frequent human ignitions harm the plant communities.
Most plant communities existed for millions of years before humans colonized North America, so it’s a stretch to suggest they now need human ignition to be healthy. For instance, ponderosa pine has existed as a distinct species for over 50 million years. Yet many cultural burning advocates argue that humans must burn these forests to keep them “healthy.” That raises the question of how they survived all those millions of years before any people were on the North American continent.
To the degree that human ignitions added to the area burned, most research suggests such cultural blazes were localized in their influence. In other words, they did not significantly affect plant and animal communities at a landscape or evolutionary scale.
We have plenty of evolutionary evidence for this since many plant communities have no unique adaptation to fire and typically have long fire rotations. Frequent fire degrades or destroys these vegetative communities.
A good example is the sagebrush steppe, one of the most widespread plant communities in the West, which once occupied hundreds of millions of acres. Sagebrush ecosystems experienced fire, but typically at 50-400 year intervals, far longer than the frequent (1-10 year) rotation suggested by cultural burning advocates. When sagebrush burns, it usually dies.
The fact that hundreds of millions of acres of the West were dominated by sagebrush suggests that tribal burning didn’t significantly influence this landscape. Furthermore, the evolution of sagebrush obligate wildlife species like sage grouse, pygmy rabbit, sage sparrow, and others is additional evidence that “cultural” burning was unimportant in sagebrush ecosystems.
Nevertheless, that has not kept cultural burning advocates and many federal and state agencies from promoting frequent burning of sagebrush because it somehow reflects the “wisdom” of tribal people.
The other side of the “good fire” or “bad fire” narrative is the idea that high-severity blazes are somehow “destructive” and need to be precluded. Again, this is a flawed narrative. High-severity fires are the norm in most western ecosystems. The only exception may be Southwest ponderosa pine landscapes, where frequent fire (typically due to abundant lightning) may have had a landscape-scale effect.
However, it is essential to point out that even in ponderosa pine forests, mixed- to high-severity blazes were relatively common elsewhere in the West, including the Colorado Front Range, eastern Oregon, the Black Hills, and the northern Rockies.
High-severity blazes produce a habitat type that frequent fire does not: snag forests. These forests are relatively short-lived and may be as crucial as old-growth forests. Many plant and animal species are found only in snag forests or are more abundant there.
For instance, many mushroom species are favored by such blazes. In numerous ecosystems, high-severity blazes improve habitat for fish, leading to more abundant and larger fish. Bird species found only in, or more abundant in, snag forests include the house wren, mountain bluebird, black-backed woodpecker, and even robins. Other wildlife whose habitat is improved by snags and down wood include fishers, martens, gray foxes, and even bears. Several studies have shown that grizzlies and black bears will preferentially tear apart down logs for ants, an important summer food source.
Plants that increase after major high-severity blazes include aspen, paper birch, and willows.
Snag forests are among the rarest habitat types in the West because most snags tumble to the ground within a few decades. At that point, they become part of the landscape’s long-term biological legacy, taking centuries to decay entirely into the soil.
It’s critical to understand that such inputs of down wood into forest and even aquatic ecosystems are relatively rare. A large, high-severity blaze may provide the only significant input of down logs in a century or two.
The prescribed or cultural burn narrative is an improvement over the old “all fire is bad” concept that once dominated public discussion. Still, in many ways, it is nearly as harmful to ecological thinking and management as the idea it replaced.
We must revise our thinking to recognize that high-severity blazes are critical to many ecosystems and that climate conditions, not fuels, drive them. We are seeing, and will continue to see, more acreage charred as the climate warms. Still, such events are not catastrophic, destructive, or disastrous, but a necessary evolutionary agent in many plant and wildlife communities.
Comments
Speaking of context, what you write may be true for the Rocky Mountains or even the intermountain West. But it is absolutely untrue for California and Mexican chaparral and the ponderosa forests of the west coast ranges such as the Sierra Nevada and the Sierra San Pedro Martir. The good fire “myth” is real in many locations. For example, it is very eye-opening to compare the chaparral and forests just north and south of the Mexican border in California. North of the border there were funds (and financial incentives) to conduct intensive fire suppression over the past century, while to the south very little fire suppression took place. If one compares the landscape around Mt Laguna east of San Diego to the mountains east of Ensenada, the latter is a patchwork quilt of chaparral and forest in different stages of succession, while the former consists of vast acreages (100,000 acres or more) of uniformly aged plants. The first time I saw it, the remarkable difference jumped out at me. The Mexican ecosystems certainly looked healthier.
One other minor quibble: I graduated from UC Davis in 1977 with a degree in Plant Science (Agronomy). As part of that degree, I took courses in both Range Management and Forest Management, and both professors delivered units on prescribed burns for fuel reduction. Those courses were in 1976, so by the mid-seventies there must have been quite a few “hands” worth of scientists advocating the use of fire as a resource management tool, if the concept had already made it into the undergraduate curriculum.
Reflecting on another comment you made, perhaps the difference between the impact of fire on Rocky Mountain and California ecosystems is due to the great difference in pre-European-contact populations in the two areas. Just as we see now, the population and human impact were significantly larger in California than in the Rockies, and human ignition sources, along with intentional burning, were likely much more extensive.