We Choose to Reduce Wildfire Danger, Not Because It Is Natural But Because It Is Good
Relying on the Naturalistic Fallacy Confuses the Case for Proactive Land Management
By Patrick Brown
Fire season is ramping up in the western United States (and Canada), with widespread impacts on lives, property, and health through exposure to smoke. These worsening impacts have become routine in recent years and are part of a decades-long trend. In the continental US, for example, approximately 2 million acres of land burned in a typical year from the 1960s through the 1990s, but that figure has more than tripled to around 7 million acres per year over the last two decades.
Humans start about 85% of wildfires, and thus the expansion of settlements into more fire-prone areas has played a role. However, the two main drivers of the increase in wildfire activity over the past several decades are a warming and drying climate and the long-term build-up of excess vegetation in much of our forests.
One primary strategy for dealing with the increase in wildfire danger is to engage in proactive land management—namely, prescribed burning and mechanical thinning to reduce hazardous fuels and mitigate wildfire danger.
This work, often referred to as Forest Restoration, includes “activities that…return forest composition and structure to a more natural state…” The argument goes that we have erred by excluding fires and introducing invasive grass species, which have created unnatural, unhealthy conditions; therefore, we must actively return the landscape to more natural conditions.
In reality, the fire regimes in the U.S. West prior to modern fire exclusion policies were heavily influenced by Indigenous burning practices. However, these practices, too, are implicitly categorized as essentially natural because of the perception that Indigenous peoples lived in harmony with nature.
Ironically, the case against proactive land management is also made in the name of naturalness. Environmental groups that seek to stop active land management projects typically do so on the grounds that human intervention moves the landscape from a perceived ideal natural state to a less ideal unnatural state. Many environmentalists argue that we must neither manage forests nor prevent fires, but rather let “nature” simply return to its “natural order.”
Both the case for “natural” passive land management and the case against “unnatural” active land management mistake the core reason for wildfire mitigation: to protect human lives, property, and health while balancing other interests like habitat maintenance for certain species, recreation, sustainable timber harvesting, etc.
“Natural” is an arbitrary category and not necessarily good
Using the degree of naturalness as an argument for or against active land management is an application of the naturalistic fallacy, which asserts that what is natural is inherently good.
The naturalistic fallacy is omnipresent in contemporary discussions around energy, environment, and food issues. For instance, “chemicals” and genetically modified organisms are often vilified for being "unnatural," while organic farming is praised despite its lower resource efficiency and higher land use. Similarly, renewable energy sources like wind and solar get favored over nuclear power in many circles due to their perceived naturalness, despite nuclear power's lower land and material footprint and higher reliability.
But the naturalistic fallacy is a fallacy for a reason. First, it is not at all clear whether it is useful or even possible to definitively distinguish between what is natural and unnatural. Humans are one branch on the tree of life, just like every other species. Our bodies are made of materials from the earth and powered by sunlight, just like every other species. We rearrange and recombine materials from our environment for our own self-interest, just like every other species. This manipulation results in things like sunblock, cement dams, and modern skyscrapers. But it is difficult to argue that a skyscraper is fundamentally different from a bird's nest or a spider’s web unless you arbitrarily define activities conducted by one particular species—humans—as being somehow unnatural.
Even if a clean line could be drawn between unnatural humans and the natural world, in what sense should nature deserve the reputation of being inherently good? Cancer, viral infections, heart disease, volcanic eruptions, tsunamis, and earthquakes all undoubtedly cause great suffering to conscious creatures who take no solace in the fact that these things are natural.
Even the parts of nature that are presented as being “in harmony,” like a basic ecosystem food web, are not so obviously good when interrogated. The food web gives the impression that every creature plays its part in maintaining a benevolent and perpetual balance. This creates the idea that each organism's role is to sacrifice for the greater good of the ecosystem.
However, the true underlying dynamic of an ecosystem is an unrelenting competition between individuals of the same species and between species for finite resources. The existence of carnivores is often taken for granted, but it’s worth reflecting on what carnivores are actually doing: Competition for resources within the ecosystem is so intense that species evolved to kill and consume each other to forcibly take their resources and incorporate them into their own bodies. The prey do not consent to this fate; they spend their lives resisting being eaten until, one day, they fail and are eaten, or they die from some other “natural” cause. Far from considering themselves to be noble components of the ecosystem, they would desperately want out of this system if given the choice.
This is not to say that nature should be disdained, but rather to make the point that nature itself is amoral and does not deserve a default status as being synonymous with “good.” Wildfires are natural in the sense that they have been present since soon after the appearance of land plants about 400 million years ago. Some species have been selected to take advantage of them. But none of this means that wildfires are desirable to humans today.
Ultimately we decide what we consider to be good through innate emotion and higher reason. We seek to persuade each other of our own versions of what is good through stories and logic, but people fundamentally disagree on values and priorities, so consensus is impossible.
Actively manage the land to optimize desired outcomes, not to optimize naturalness
Even if we will never reach a universal consensus on the optimal balance of tradeoffs in land management decisions, some principles will have wider endorsement than others. It is not surprising that, as people, we would want to prioritize human interests that enjoy broad support, including water quality for municipal use, habitat maintenance and pest management (inevitably favoring some species that we prefer over others), aesthetics for recreation and tourism, timber harvesting, and wildfire mitigation.
Previous forest management regimes sought to maximize human benefits by excluding fire completely, but this has backfired. In 1911, the federal government adopted an official policy that put an end to the controlled burns long practiced by many Indigenous peoples, and in 1935 it instituted the “10:00 a.m.” policy, which sought to ensure that all fires were extinguished by mid-morning on the day after they were detected. Initially, this worked extremely well: the acreage burned in the US fell from close to 50 million acres per year early in the century to the aforementioned 2 million acres per year late in the century. But there were unintended consequences.
The lack of fires meant that locations that would previously burn perhaps once per decade have now gone over a century without fire, leading to a buildup of small trees and brush. For example, in the 1800s, there were about 50 trees per acre in U.S. West Ponderosa pine forests, but now that number is approximately 200 trees per acre. The buildup of “hazardous fuels” is like the stretching of a rubber band: potential energy accumulates, ready to be released all at once the moment a fire inevitably occurs. Compounding the problem, more vegetation means increased competition for finite resources (like water and sunlight), which makes ecosystems more susceptible to disease and more sensitive to drought enhanced by a warming climate. It also means that fires are more difficult to fight because firefighters have a harder time accessing them, and that fires burn more intensely, producing more smoke that harms human health and more ash that harms municipal water supplies. Finally, this buildup increases the likelihood that fires will climb into forest canopies, making them more likely to kill mature and old-growth trees.
Mechanical thinning combined with prescribed burns for wildfire mitigation has become a priority because, for the most part, wildfires are bad for human interests. Not only is an unprecedented number of people and structures at risk of being directly affected by wildfires (the number of structures adjacent to wilderness in the US grew by 60% from the 1990s to 2020 alone), but we are also continually learning more about the negative impacts of wildfire smoke on human health and mortality, along with other indirect effects, which together impose enormous economic burdens of hundreds of billions of dollars per year.
In this context, we must be extremely careful about reverting to the more “natural” fire regime that existed prior to the era of fire exclusion. Fire activity in the late 20th century likely marked a minimum compared to at least the past 3,000 years. As one study on prehistoric fires in California put it:
The idea that US wildfire area of approximately two million ha annually is extreme is certainly a 20th or 21st century perspective. Skies were likely smoky much of the summer and fall in California during the prehistoric period.
But going forward, the situation is unique. We have an excess of hazardous fuels on the landscape, the climate is becoming more conducive to catastrophic wildfires, and roughly two orders of magnitude more people live in the U.S. West than there were prior to European colonization.
We can and should fight fire with prescribed fire (i.e., controlled burns) because it reduces both subsequent wildfire intensity and the amount of smoke released compared to an uncontrolled wildfire. Moreover, prescribed burns are conducted under conditions in which prevailing winds will minimize human exposure to smoke. For this to work optimally, prescribed fire must often be paired with prior mechanical thinning of the landscape to physically remove brush and younger small-diameter trees so that the fire stays safely low and does not climb into forest canopies.
Conducting these hazardous fuel treatments at scale will reduce wildfire danger, and our motivation for embarking on this project is the enormously detrimental impact of wildfires on our lives, health, property, and economy. It is not based on adhering to a misguided notion that the most natural state of things is inherently the best state.