How do we evaluate wildfire risk?
January 17, 2024
Kaileen McCulloch
Software Developer
You may have already seen this graph, or some variation of it, depicting a two-and-a-half-fold increase in the number of hectares burned in Canada compared to the previous record. As of November 1, 2023, a total of 18,401,197 hectares had burned. That is roughly equivalent to the entire state of Washington!
Considering that metric alone, it would be difficult to argue that the risk of wildfire hasn't increased substantially. Yet how do we actually quantify wildfire risk? In Canada, we have developed a system for assessing risk called the Canadian Forest Fire Danger Rating System (CFFDRS). It is composed of two components: the Canadian Forest Fire Weather Index (FWI) System and the Canadian Forest Fire Behavior Prediction (FBP) System. The FWI System is used to determine the likelihood that a fire starts, whereas the FBP System predicts how a fire could behave once it has started.
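To make that division of labour concrete, here's a minimal sketch of the two components as Python function signatures. The names, parameters, and groupings are our own illustration of the inputs and outputs, not an official API:

```python
def fwi_system(temp_c, rel_humidity_pct, wind_kmh, rain_mm, yesterdays_codes):
    """Turns a day of weather (plus yesterday's moisture codes) into
    fire-danger indices: FFMC, DMC, DC, ISI, BUI, and the final FWI.
    Answers: how likely is a fire to start and build today?"""
    ...


def fbp_system(fwi_indices, fuel_type, slope_pct, latitude, longitude, elevation_m):
    """Combines the weather-driven indices with fuel and terrain inputs to
    predict behaviour: rate of spread, fuel consumption, fire intensity.
    Answers: what would a fire do once it has started?"""
    ...
```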
At Path and Focus, we're very interested in understanding wildfire risk. We've built a danger-rating tracking tool called Beacon, where crews can easily see current and forecasted danger ratings along with the work restrictions calculated from them. We thought it would be interesting to take the data that backs Beacon and plot it against historical data, to see whether the same kind of spike shows up in the weather data and the FWI values derived from it.
A reasonable hypothesis for explaining this extreme fire season would be to blame the weather, right? However, these plots tell a different story. The extreme spike we see in the number of hectares burned is missing from every index in the FWI System. The data is an aggregation of all of the weather stations in British Columbia, and the index values are derived from four weather inputs: temperature, wind speed, precipitation, and humidity.
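For the curious, the last stage of that derivation is compact enough to sketch in Python. The functions below follow the standard published FWI System equations (Van Wagner, 1987); the three moisture codes they consume (FFMC, DMC, and DC) are themselves bookkeeping values updated each day from the four weather inputs, which we omit here. Treat this as an illustration rather than a reference implementation:

```python
import math


def isi(ffmc: float, wind_kmh: float) -> float:
    """Initial Spread Index: how fast a fire spreads right after ignition."""
    # Convert FFMC back to a fine-fuel moisture content (percent).
    m = 147.2 * (101.0 - ffmc) / (59.5 + ffmc)
    # Fine-fuel moisture effect, scaled by an exponential wind effect.
    f_f = 91.9 * math.exp(-0.1386 * m) * (1.0 + m ** 5.31 / 4.93e7)
    return 0.208 * math.exp(0.05039 * wind_kmh) * f_f


def bui(dmc: float, dc: float) -> float:
    """Buildup Index: total fuel available, from the two deeper moisture codes."""
    if dmc == 0.0 and dc == 0.0:
        return 0.0
    if dmc <= 0.4 * dc:
        return 0.8 * dmc * dc / (dmc + 0.4 * dc)
    return dmc - (1.0 - 0.8 * dc / (dmc + 0.4 * dc)) * (0.92 + (0.0114 * dmc) ** 1.7)


def fwi(isi_value: float, bui_value: float) -> float:
    """Fire Weather Index: the headline fire-intensity rating."""
    if bui_value <= 80.0:
        f_d = 0.626 * bui_value ** 0.809 + 2.0
    else:
        f_d = 1000.0 / (25.0 + 108.64 * math.exp(-0.023 * bui_value))
    b = 0.1 * isi_value * f_d
    return math.exp(2.72 * (0.434 * math.log(b)) ** 0.647) if b > 1.0 else b
```

For a sense of scale, a dry mid-summer day with FFMC 90, DMC 60, DC 400, and a 15 km/h wind gives `fwi(isi(90, 15), bui(60, 400))` of roughly 28.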
We've only shown a subset here, but if you're interested in looking at the rest of the indices, you can browse them yourself in our Observable Notebook.
The only one that starts high and remains highest throughout most of the season is the Drought Code, an indicator of the moisture content in deep duff layers and large woody debris. A high Drought Code can lead to persistent fires, because they burn deep into these layers, yet it doesn't always correlate with fire starts. The Fine Fuel Moisture Code relates more strongly to the likelihood of ignition, but it also fluctuates more from day to day, which makes yearly trends hard to observe.
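One way around that fluctuation is to smooth the daily values before looking for a trend. Here's a minimal sketch, assuming a hypothetical CSV export of daily per-station index values (the file name and column names are ours):

```python
import pandas as pd

# Hypothetical export of daily FWI indices for BC weather stations,
# with columns: date, station_id, ffmc, dc.
df = pd.read_csv("bc_daily_indices.csv", parse_dates=["date"])

# Average across stations, then apply a centred 14-day rolling mean so the
# seasonal FFMC trend becomes visible underneath the day-to-day noise.
daily = df.groupby("date")[["ffmc", "dc"]].mean()
daily["ffmc_14d"] = daily["ffmc"].rolling(window=14, center=True).mean()

print(daily.tail())
```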
The FBP System is where we see indicators for spread rate and fuel consumption, but these are based on many more factors, including fuel type and fuel states such as greenness, percent curing, and crown height, as well as land features such as slope, latitude, longitude, and elevation. Unfortunately, we cannot calculate these values from our weather data alone. But if we could, would we find the missing spike there?
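To see why weather alone falls short, it helps to lay out everything the FBP System wants as input. The grouping below is illustrative and the field names are ours, not the official input specification:

```python
from dataclasses import dataclass


@dataclass
class FBPInputs:
    # Weather-driven values: the part we can derive from station data.
    ffmc: float
    bui: float
    wind_kmh: float

    # Fuel inputs: require fuel-type maps and vegetation state, not weather.
    fuel_type: str            # a CFFDRS fuel type code, e.g. "C-2" (boreal spruce)
    greenness: float          # degree of green-up in living vegetation
    percent_curing: float     # how cured (dead) the grass component is
    crown_base_height_m: float

    # Terrain inputs: require geospatial data for each location.
    slope_pct: float
    latitude: float
    longitude: float
    elevation_m: float
```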
“In a nutshell, it's a lot drier out there than what [the] FWI is telling us,” says Robert W. Gray, wildland fire ecologist. The FBP System uses a rudimentary start-up procedure and is based on research done decades ago. This means it assumes a fuel moisture content significantly higher than what we're actually seeing today. The CWFIS even warns that the fuel types are “not suitable for operational fire management because of the moderate resolution and limited scope of the input data.” An example of this is the impact we're seeing with living vegetation, or live fuels. Historically, we have considered live fuels to lack the potential for fire ignition. However, as our temperatures trend upwards, this is quickly becoming a false assumption. In the spring, fuel is drying out more quickly, and in the fall, as large-scale fires continue to burn past frost, we are seeing these live fuels play a major role in late-season fire behaviour as well.
In a recent article published in Nature, wildfire experts look at additional factors that are changing the wildfire equation: anthropogenic factors such as fire suppression and land management practices. For the past century, we have taken a hard stance on fire suppression. Our default behaviour has been to extinguish fires, and with good reason: these efforts save lives and livelihoods. This is especially crucial as our wildland-urban interface continues to expand, increasing the risk to people if we let fires burn. However, fire has many benefits on the landscape. For one, it helps control pest species like the mountain pine beetle, which is killing our trees and leaving behind more volatile fuels. Without the use of fire, forest harvesting practices lead simply to a rearrangement of fuels, not a reduction of fuels. Logging takes crown fuels and sets them on the forest floor. Thinning the crown reduces the likelihood of crown fire spread, but it increases the amount of fine fuel available for combustion and consequently leads to higher surface fire intensity. Prescribed burning of logging slash is needed to effectively reduce this potential. The other benefit of prescribed burning is that it facilitates the retention of more fire-resistant species. Yet instead we are observing our forest management practices do the opposite: we are shifting the diversity of the ecosystem towards more flammable species.
So if the world we live in today doesn't match our assumptions from the past, how are we supposed to use historic standards to make predictions for the future? Learning to accurately measure and predict wildfire risk has become critically important as fires increasingly threaten our safety. How do we update our current approach to strengthen our situational awareness? It's a question we at Path and Focus are wrestling with every day.