Estimate: About half the area burned since 1984 can be attributed to warming.

Wildfires in the American West can make for apocalyptic images, but they’re also routine, as the heat of the dry season can turn large areas of forest into fires-in-waiting. One lightning strike—or one careless human—can set off a blaze that consumes tens of thousands of acres.

Several factors contribute to the extent of these wildfires. We've long tried to put them out as soon as possible, which is well intentioned and sometimes necessary to protect ever-expanding human communities. But in many places, suppressing fires has disrupted a natural process of forest housekeeping. With small bits of fuel left to accumulate on the forest floor for longer, fires become less frequent but much more intense.

Climate also plays a role. Year-to-year variability leaves some summers noticeably drier and hotter than others. And then there's climate change. What can we say about its influence on fires in the West?

Previous research has determined that wildfires are already worse because of global warming. In a new study, John Abatzoglou of the University of Idaho and Park Williams of the Lamont-Doherty Earth Observatory take this analysis one step further by attempting to quantify global warming's influence.

To do that, the researchers used a number of metrics that track drought and the dryness of burnable forest material, as well as the area of forest that actually burned each year. Those aridity metrics correlate very strongly with wildfire area, tracking its ups and downs from one year to the next. So if the human-caused portion of that aridity can be isolated, its contribution to burned area can be estimated as well.
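To make the idea concrete, here is a minimal sketch in Python using invented placeholder data rather than the study's records. It shows the kind of correlation the analysis leans on; working with the log of burned area is a common choice because burned area swings over orders of magnitude from year to year.

```python
# Illustrative only: the arrays below are made up, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1984, 2016)

# Stand-in fuel-aridity index: a slow upward trend plus year-to-year noise.
aridity = np.linspace(0.0, 1.0, years.size) + rng.normal(0.0, 0.2, years.size)

# Stand-in burned area, which varies exponentially with aridity.
burned_area = np.exp(1.5 * aridity + rng.normal(0.0, 0.3, years.size))

# Correlate aridity with log burned area, since area spans orders of magnitude.
r = np.corrcoef(aridity, np.log(burned_area))[0, 1]
print(f"correlation of fuel aridity with log burned area: r = {r:.2f}")
```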

The human-caused trend was calculated using the smoothed average of a large group of climate model simulations, focusing on changes in temperature and evaporation potential. Each aridity metric was calculated for the model average, allowing for an apples-to-apples comparison. The difference between the smooth model trend and the wiggly observations shows the contribution of natural climate variability.
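The decomposition itself is conceptually simple. The toy sketch below, with placeholder names and arrays rather than anything from the paper, splits an observed aridity series into a smoothed, model-derived forced component and a leftover natural-variability component.

```python
# Toy version of the decomposition described above. Names and arrays are
# placeholders, not the study's data or code.
import numpy as np

def split_forced_and_natural(observed_aridity, model_mean_aridity, smooth_window=10):
    """Return (forced, natural) components of an observed aridity series."""
    # Smooth the multi-model mean so only the slow, forced trend is left.
    kernel = np.ones(smooth_window) / smooth_window
    forced = np.convolve(model_mean_aridity, kernel, mode="same")
    # Whatever remains in the observations is treated as natural variability.
    natural = observed_aridity - forced
    return forced, natural
```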

The dryness of fire fuel has increased over the last century—and mostly over the last 35 years. About 55 percent of that increase since 1979 is due to climate change, with the rest being the result of natural variability like the sluggish climate see-saw called the Pacific Decadal Oscillation. The length of the fire season increased by about two weeks over that time period because of climate change alone. By the time the 2000s rolled around, the area of forest at high fire risk was about 75 percent higher than it would have been without global warming.
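One crude way to express a share like that 55 percent, continuing the toy decomposition above, is the forced change as a fraction of the total observed change over the period; the study's actual calculation is more careful, but the idea is the same.

```python
# Crude attributable-share estimate: forced change divided by total observed
# change over the period. Continues the placeholder sketch above.
def attributable_share(forced, observed):
    forced_change = forced[-1] - forced[0]
    observed_change = observed[-1] - observed[0]
    return forced_change / observed_change
```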

As for the areas that did end up burning, the researchers estimated the amount of territory that burned due to climate change alone between 1984 and 2015. It turned out to be an area the size of Massachusetts and Connecticut combined. That just about doubles the total amount that would have burned due to natural weather variability.
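For a rough sense of scale, the combined area of those two states works out to roughly ten million acres. The quick conversion below uses approximate total state areas; it is a back-of-the-envelope check, not a figure taken from the study itself.

```python
# Back-of-the-envelope conversion using approximate total state areas
# (land plus water); not a number reported by the study.
SQ_MILES_TO_ACRES = 640
massachusetts_sq_mi = 10_565   # approximate total area
connecticut_sq_mi = 5_543      # approximate total area

combined_acres = (massachusetts_sq_mi + connecticut_sq_mi) * SQ_MILES_TO_ACRES
print(f"{combined_acres / 1e6:.1f} million acres")  # roughly 10 million acres
```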

So climate change didn’t just “play a role”—it co-starred, at least.

Like all estimates, this one relies on some simplifying assumptions. It focuses on human-caused changes in average temperature, leaving out possible influences on weather variability, precipitation, and wind. It doesn’t account for the impacts of climate change on the mountain snowpack that melts over the summer, for example, or the pine bark beetle population boom that has left vast areas of trees standing dead. And warming may even be increasing the number of lightning strikes that can touch off fires. So it could be that climate change is responsible for a slightly smaller share of wildfires—or an even larger share.

There are several things we could do to limit wildfires in the American West, but halting climate change is clearly one of them. The researchers write, “We expect anthropogenic climate change and associated increases in fuel aridity to impose an increasingly dominant and detectable effect on western US forest fire area in the coming decades while fuels remain abundant.”