Photo: Anecdotal and scientific evidence tell us that humans have long had a role in shaping fire regimes. (Credit: Coconino National Forest via Flickr Creative Commons)
In January, I used my Science Tuesday post to vent about the notion of nature and its inherent and unproductive separatism. In February, the media went wild covering a paper (Balch et al. 2017) that had just been published in the Proceedings of the National Academy of Sciences (PNAS). That paper, led by a team from the University of Colorado and the University of Massachusetts, looked at wildfire ignitions in the United States from 1992–2012, and it found that humans were the cause of 84 percent of wildfires during that 21-year period. I’m sure you all remember seeing coverage of it — the story got picked up by Smithsonian, High Country News, Popular Science and dozens of other outlets across the country. And it reinforced some of the same ideas that I questioned in January: that we humans are exerting an unnatural influence on our environment, that fire is a manifestation of our destructive disconnect with nature, and that our relationship with fire is a recent one. Powerful stuff, but I have to say, it didn’t sit well with me.
Don’t get me wrong — the data presented in Balch et al. are really interesting. They used a spatial database of wildfires in the U.S. (Short 2014) to assess the relative importance of human and lightning ignitions for 1.5 million wildfires that occurred between 1992 and 2012. They found that human ignitions were responsible for almost 1.3 million of those fires, and that human-started wildfires dominated (i.e., caused more than 80 percent of fires) across 60 percent of the country. Lightning-started fires dominated only the mountainous regions of the western U.S., whereas the eastern U.S. and western coastal areas were all dominated by human-started fires (see figure). These numbers are impressive, and certainly offer food for thought regarding wildfire awareness and prevention.
However, it is in the interpretation of these data that things start to get tricky. The authors argue that “human ignitions vastly expanded the extent of wildfire,” and that “human-related ignitions more than tripled the length of the wildfire season.” There is an implicit, and I would argue problematic, assumption in these statements: that the only “natural” fires are those that are started by lightning. The use of past tense — expanded, tripled — implies some sort of historical baseline, yet the analysis actually uses lightning patterns as a proxy for historical, “natural” fire regimes. Their analysis showed that humans are starting fires in areas that wouldn’t burn from lightning, and at times of year that are outside the lightning-caused fire season.
And it’s this that I’ve been struggling with: is a baseline of lightning ignitions realistic or fair? Can we rightly assume that any fire that wasn’t lightning-ignited was unnatural or outside of the historic range of variability? I would say no. We have strong evidence that human activity has dictated fire regimes for centuries (and probably millennia); see, for example, Taylor et al. 2016, which I covered in my January blog. And we know that in many of our most fire-adapted plant communities — deciduous oaks, longleaf pine, prairies and rangelands, mixed-conifer forests and even coastal redwoods — anthropogenic fire was and is a critical force of creation and maintenance. Without human-started fires, many of our most beloved and biodiverse habitats wouldn’t even exist.
So I ask this of you: how can we reconcile the results of this paper with the rich cultural connections between people and fire? And how do we factor the fire deficit into this story? Humans may be starting fires, but we are also putting them out, and all of our human-ignited fires haven’t made up that difference. So how do we, as humans, fit into the “natural fire niche” that the paper talks about? Are human-started fires only “natural” if they’re mimicking historical patterns, or can we accept that our actions as humans are — for better or worse — inherently natural, because what else can they be?
References:
Balch, J. K., Bradley, B. A., Abatzoglou, J. T., Nagy, R. C., Fusco, E. J., & Mahood, A. L. (2017). Human-Started Wildfires Expand the Fire Niche Across the United States. Proceedings of the National Academy of Sciences, 114(11), 2946-2951.
Short, K. C. (2014). A Spatial Database of Wildfires in the United States, 1992-2011. Earth System Science Data, 6(1), 1-27.
Taylor, A. H., Trouet, V., Skinner, C. N., & Stephens, S. (2016). Socioecological Transitions Trigger Fire Regime Shifts and Modulate Fire-Climate Interactions in the Sierra Nevada, USA, 1600–2015 CE. Proceedings of the National Academy of Sciences, 201609775.
My answer to your question “is a baseline of lightning ignitions realistic or fair?” is an unequivocal NO. Beyond the problematic assumption that the only natural fire is lightning-caused (which you rightly point out), my issue is the ahistorical nature of their analysis: they don’t appropriately recognize that their results reflect only a recent, 20-year data set. The issue isn’t the need to reconcile the results of this study with the rich cultural connections of humans with fire; rather, the paper over-extrapolates its conclusions and fails to situate its results within the larger body of knowledge about conditions at other points in time. Had the authors done so, they might have seen the ‘strong evidence’ you reference and reached more appropriate conclusions: their results aren’t anything new. The dynamic may be different today, but humans have long been the cause of a significant portion of fires in North America. That framing would also have allowed them to recognize the long history of Native Americans ‘expanding the fire niche,’ as well as the more recent role European culture has played in removing fire from the system …