When Wildfires Heat Up, Size Isn't What Matters Most

Cloud computing is transforming the science of wildfire severity

By Emily Shepherd

October 8, 2022

Aerial view of dead trees standing in a burned forest

Photo by JasonDoly

Each summer in recent memory has spawned a record-breaking, gargantuan megafire. Between April and June of this year, the Hermits Peak Fire burned 534 square miles of mostly forestland to become the biggest wildfire in New Mexico history. In 2021, the 1,505-square-mile Dixie Fire blazed through California's previous size record for a contiguous wildfire. 

Big acreage translates to big news, partly because it's easy to contrast the size of fresh burns with those of previous fires. But fire size is only part of the story, and bigger isn't always a bad thing. Policies on many federal, state, and private lands actually encourage firefighters to let wildfires grow larger when safety permits.

"It's how they burn, not how much they burn in acreage," said Andrea Thode, a professor of fire ecology and management at Northern Arizona University who studies the complex science of burn severity. Burn severity is an assessment of a fire's impact measured in tree mortality and soil damage. Low-severity fires do not significantly change the forest in the long term; they are often an integral part of the ecosystem. High-severity fires, those that kill more than 95 percent of the trees in an area, bring profound transformation, and the scarred lands in their wake can resemble a moonscape. 

"The implications of high-severity fire, along with climate change, are very serious," Thode said. For a long time, researchers had a limited idea of how much high-severity fire was happening. That’s because while understanding burn severity is essential to predicting fire's impacts on future forests, it's a monumental task. Historically, the research required strenuous, expensive, and time-consuming fieldwork across a vast landscape. Researchers would have to travel to remote locales and take measurements in the ashes. 

In recent years, scientists like Thode have taken a different approach by tackling the problem with satellite imagery and cloud computing tools that are now changing the field. As a result, new studies now offer the most comprehensive view of burn severity we've ever had, with deeply concerning results.

Back in 2002, Thode was beginning one of the biggest wildfire severity research projects in the world, measuring tree mortality, char height on trunks, and soil damage on over 900 sites in 15 national forests. Still, the colossal acreage burning yearly dwarfed the project’s already massive scope. She knew she needed a different set of tools to get a broader perspective.

"The logical choice when you're talking about that much land area is to use remote sensing," she said. By that point, satellites had been blasting home data from orbit for decades, but nobody had found a way to relate the satellite images to burn severity on the ground. Then, right around the time Thode realized she needed such a tool, researchers Carl Key and Nathan Benson invented one.  When Thode put their new tool to use, she realized there was a catch: It was blind to extreme fire severity in certain forests, particularly ones with sparse tree cover. 

In satellite imagery, the difference between sparse, living forests and their dead counterparts looked minimal even after a major burn. Following such fires, analyses were spitting out moderate severity values even when, on the ground, every plant was dead.
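The arithmetic shows why. The dNBR subtracts a postfire satellite measurement from a prefire one, so a stand with little vegetation to begin with has little signal to lose. Here is a minimal sketch in Python, with invented reflectance values chosen purely for illustration (real analyses use calibrated satellite bands, and severity thresholds vary by study):

    def nbr(nir, swir):
        # Normalized Burn Ratio: healthy vegetation reflects strongly in
        # near-infrared (nir) and weakly in shortwave infrared (swir).
        return (nir - swir) / (nir + swir)

    # Dense forest: strong vegetation signal before the fire.
    dense_pre = nbr(nir=0.45, swir=0.10)    # ~0.64
    dense_post = nbr(nir=0.15, swir=0.30)   # ~-0.33

    # Sparse forest: weak vegetation signal even before the fire.
    sparse_pre = nbr(nir=0.25, swir=0.18)   # ~0.16
    sparse_post = nbr(nir=0.18, swir=0.24)  # ~-0.14

    # dNBR, conventionally scaled by 1,000, is the absolute drop in NBR.
    dnbr_dense = 1000 * (dense_pre - dense_post)     # ~970: reads as high severity
    dnbr_sparse = 1000 * (sparse_pre - sparse_post)  # ~306: reads as only moderate

Both hypothetical stands are completely killed, but the sparse one started with a weak vegetation signal, so its raw dNBR lands in the moderate range.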

It didn't take long, however, for Thode and colleague Jay Miller to solve the mystery. In 2007, Thode and Miller published a new method for measuring severity, one that relativizes the change a satellite detects against how much vegetation the forest held before the fire. The Monitoring Trends in Burn Severity program, a national interagency program introduced two years earlier, adopted the new method. Today, MTBS has mapped the severity of all large US wildfires, more than 20,000 of them, from 1984 to the present. There is no other program like it in the world.
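Their relativized index, RdNBR, divides the raw change by the square root of the prefire NBR's magnitude, so stands that lose the same proportion of vegetation score similarly. A minimal sketch, continuing the invented numbers above (the published formula works on NBR values scaled by 1,000, and operational severity thresholds vary):

    import math

    def rdnbr(dnbr_scaled, prefire_nbr):
        # Relativized dNBR: scale the raw change by prefire vegetation,
        # so equal proportional losses score alike regardless of how
        # dense the stand was before the fire.
        return dnbr_scaled / math.sqrt(abs(prefire_nbr))

    print(rdnbr(970, 0.64))  # ~1213: dense stand still reads as high severity
    print(rdnbr(306, 0.16))  # ~765: sparse stand now reads as high severity too

With the relativization, the sparse stand from the earlier sketch climbs into the high-severity range where, given total mortality, it belongs.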

Now we know that as US fires have grown in size, the footprint of high-severity fires has grown too, Thode explained. In some places, the two have expanded in tandem. But in the Southwest, the percentage of high-severity burn in each wildfire is increasing, amplifying the trend.

That could spell calamity for the region's iconic ponderosa pine forests. The trees don't reseed the forest floor after a severe fire, so one extreme burn can banish them altogether. Today, 20 years after the Rodeo-Chediski Fire burned over 700 square miles of Arizona, some severely impacted areas are eroded to bedrock. Others are dominated by Gambel oak and alligator juniper, which are beating out ponderosa stands in the race to repopulate the worst parts of the burn. 

And the climate crisis can compound such effects. Compared with a mature tree, a seedling's shallow roots and thin bark are more susceptible to drying out in inhospitable conditions. The hotter, drier weather that comes with climate change can make it harder for seedlings to grow back after severe wildfires.

Innovations in the field are ongoing, and researchers are still working to build better tools for analyzing burn severity. Another breakthrough came in 2018, when Sean Parks, a wildfire ecologist with the US Forest Service, published a method that harnesses the power of cloud computing in Google Earth Engine (GEE). Conclusions about large-scale changes in fire severity, or their relationship to climate change, can't be drawn from MTBS data until scientists analyze it. While MTBS is a database and mapping program, GEE is, among other things, a computer powerful enough to do the math.

Parks' idea was to use GEE to create single composite images of the pre- and post-fire landscape by averaging many satellite passes over each period. This eliminated what had previously been the most laborious and subjective step in calculating severity: hand-selecting individual pre- and post-fire images. In 2020, Parks and colleague John Abatzoglou used the new method to publish the most extensive study to date, covering all forested ecoregions of the West, linking the long-term increase in wildfire severity to climate change.
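In Earth Engine, the compositing step itself comes down to a few lines. Here is a minimal sketch using the earthengine Python API and Landsat 8 surface reflectance; the fire location and date windows are placeholders, this is an illustration of the idea rather than Parks' actual code, and a real analysis would also mask clouds:

    import ee

    ee.Initialize()

    # Placeholder fire area: a 20 km buffer around an arbitrary point.
    fire = ee.Geometry.Point([-121.4, 40.0]).buffer(20000)

    def add_nbr(img):
        # NBR from Landsat 8 surface reflectance:
        # near-infrared (SR_B5) versus shortwave infrared (SR_B7).
        return img.normalizedDifference(['SR_B5', 'SR_B7']).rename('NBR')

    landsat = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
               .filterBounds(fire)
               .map(add_nbr))

    # Season-long mean composites stand in for hand-picked single scenes.
    pre = landsat.filterDate('2020-06-01', '2020-09-01').mean()
    post = landsat.filterDate('2021-06-01', '2021-09-01').mean()

    # A scaled dNBR image for the fire area, ready to map or export.
    dnbr = pre.subtract(post).multiply(1000).clip(fire)

Because each composite averages a whole season of imagery, the result no longer depends on which individual scene an analyst happened to choose.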

These results, like Thode's, were concerning. "What we found was an eight-fold increase in annual area burned at high severity," Parks said. "What we're seeing now, especially with climate change, is that these forests are having a more difficult time recovering." 

Even as new US data emerges, scientists are turning the latest tools to global forests in hopes of learning how fire severity is trending worldwide. To that end, Parks' approach is freely available, making severity mapping accessible to international researchers. Since MTBS maps only US wildfires and no comparable program exists elsewhere, the GEE method is likely the easiest way for wildfire ecologists in other countries to track large-scale changes in severity.

Researchers in Mexico and Canada have already used the GEE mapping tools to assess their own forests. "Now," Parks said, "anybody in the world can make maps of burn severity."