Despite its long history of wildfires, Canada still doesn’t know how to live with them

In the fall of 1922, the city of Toronto sent 85 surplus streetcars to Haileybury and other northern Ontario towns to help house thousands of desperate people who had lost their homes to wildfires.

Known as the Great Fire, the blaze burned nearly 1,700 square kilometres — including the town of Haileybury. It killed 43 people and caused millions of dollars in property damage across 18 townships. A newspaper called it the “worst disaster that had ever overtaken northern Ontario.”

It was not.

The wildfires back then were as fierce and deadly as the ones we have today, and eerily similar to them. And we have yet to learn to live with them.

Fires of the past

The Great Miramichi fire, which destroyed forests and devastated communities across northern New Brunswick in 1825, was the largest and one of the deadliest wildfires in North American history.

The Saguenay and Ottawa Valley fires in 1870 could have been just as deadly when they forced the evacuations of several thousand people. The capital city would have burned down that summer had it not been for a quick-thinking engineer who ordered the gates of the St. Louis dam on the Rideau Canal to be breached so that it would flood city streets.

The following year, fires levelled 17 villages in Wisconsin, killing between 1,200 and 1,500 people.

In 1881, the Michigan Thumb fires burned 1,480 barns, 1,521 houses and 51 schools, while killing 283 people and injuring many others. Smoke from those fires coloured the sky over Toronto.

In 1908, the British Columbia town of Fernie was levelled by a wildfire. In 1911, the Porcupine fire killed 73 people while levelling the towns of South Porcupine and Pottsville in Ontario before partially destroying Golden City and Porquis Junction.

There was almost no warning five years later when a deadlier complex of fires swept through the same region and killed 223 people.

Each summer and fall, it seemed, ended badly somewhere.

Déjà vu

The similarities between the fires then and now are uncanny, as described in my book Dark Days At Noon: The Future of Fire. Fires between 1870 and 1922 were fuelled by higher temperatures, drier forests and the kind of elevated lightning activity we are experiencing today.

Much of the warming back then can be attributed to the end of the Little Ice Age (1300 to 1850), which had dramatically cooled parts of the world, and to the Industrial Revolution of the late 18th and early 19th centuries.

Today, the unprecedented warming taking place is primarily because of the burning of fossil fuels.

Forest land-grabbing and negligence have also fuelled numerous fires, past and present.

Around the turn of the 20th century, people moved into boreal and temperate forests to take advantage of cheap land and jobs in the mining and forestry sectors. Today, people are building luxurious country homes in places like the Okanagan to escape the cost of living in big cities.

Sparks from trains and the careless disposal of locomotive ash accounted for a significant number of fires in Ontario in the past. Following the Lytton fire in B.C. in 2021, the head of Canada’s Transportation Safety Board acknowledged that more work is still needed to prevent wildfires caused by trains.

Gaps in public policy

The other thing that hasn’t changed much is public policy. The Porcupine fire in 1911 was Canada’s version of the Big Burn, a complex of fires that swept through the northern Rockies of the United States in 1910 and prompted sweeping policy changes.

The destruction caused by the Big Burn of 1910 pushed the U.S. to revamp its wildfire management strategy. (Forest Service Northern Region/flickr), CC BY

Following the Big Burn, the U.S. passed the Weeks Act that authorized the government to purchase up to 30 million hectares of land to protect watersheds from development and wildfire. This mandated the U.S. Forest Service to work with state fire bureaus, which were happy to co-operate because it came with funding they could not otherwise afford.

In contrast, Canadian politicians failed to do what was necessary to prevent future fires. The government, which owned many of the railroad companies, blamed Indigenous people for many fires. Better legislation and fire management strategies were still not in place five years after the Porcupine fire when the Matheson fire took the lives of 223 people. Nor were they there in 1922, when the Great Fire devastated Haileybury.

Canada had a chance to replicate what the U.S. Forest Service was doing, but failed to: funding for fire research and management was gutted by budget cuts and the off-loading of responsibilities to the provinces in the 1930s.

Even today, provinces like Alberta have cut wildfire budgets to save money, only to pay the price when disaster strikes, as in 2016, when the Fort McMurray wildfire forced the evacuation of 88,000 people.

Managing future fires

The fact that fire is still entering towns like Lytton and Fort McMurray without adequate warning suggests we have yet to learn to live with the fires that we have stoked by burning fossil fuels, draining wetlands and suppressing natural fires that would have otherwise produced more resilient forests.

Stopping Indigenous burning that aided forest regeneration didn’t help.

We are now in a unique situation where hot fires are creating their own weather — fire-driven thunderstorms and pyrogenetic tornadoes — that can spawn other fires. We saw this in Fort McMurray in 2016, in B.C. in the years that followed, and in Australia’s 2019-20 Black Summer fire season, which led to a massive outbreak of fire-induced, smoke-infused thunderstorms.

This is, in a word, scary.

The title of my book Dark Days at Noon harkens back to 1780 when smoke from distant fires blocked out so much sunlight that people from all over New England thought the end of the world was at hand. The end of the world is not at hand, but there will be many more dark days at noon if we do not learn to live with fire.

Edward Struzik, Fellow, Queen’s Institute for Energy and Environmental Policy, School of Policy Studies, Queen’s University, Ontario

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Canada is witnessing more thunderstorm impacts than ever before

Gregory Kopp, Western University; David Sills, Western University, and Julian Brimelow, Western University

Residents in eastern Ontario are still recovering after a tornado-producing thunderstorm left a path of destruction over 55 kilometres long and up to 1,400 metres wide in July.

Such thunderstorms, and the damage they leave behind, can have deep and far-reaching impacts on society and the economy, and they are only increasing.

In Canada, the new normal for yearly insured catastrophic losses has reached $2 billion — a significant increase from the $422 million per year between 1983 and 2008 — and a significant chunk of that is from thunderstorm-related severe and extreme weather.

We at the Northern Tornadoes Project and its recently launched offshoot, the Northern Hail Project, are often asked whether these severe and extreme weather events are on the rise, and whether this has anything to do with human-caused climate change. The simple answer is: it’s complicated.

The difference between severe and extreme

Severe thunderstorms occur in Canada every year, bringing with them large hail, damaging downburst winds, intense rainfall and tornadoes. More rare and of even greater concern are extreme weather events — with their size, intensity or even time of year well beyond what is typically expected based on past observations.

Prairie tornado in D’Arcy, Sask. on June 15, 2021. (David Sills), Author provided

Extreme weather conditions include tornadoes causing damage rated EF3-EF5 and significant hail of over five centimetres in diameter. Extreme weather can also arise when large hail accompanies downburst winds — increasing the hailstone impact energy — or when a long-lived thunderstorm system results in a derecho, which is a cluster of downbursts (and sometimes embedded tornadoes) resulting in intense damage over hundreds of kilometres.
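The point about downburst winds raising hailstone impact energy can be illustrated with a back-of-the-envelope calculation. This sketch assumes spherical stones falling at a simple terminal velocity; the drag coefficient, densities and wind speed are illustrative values, not figures from the projects described here.

```python
import math

G, RHO_ICE, RHO_AIR, CD = 9.81, 900.0, 1.2, 0.6   # illustrative constants (SI units)

def impact_energy(diameter_m, wind_ms=0.0):
    """Kinetic energy (J) of a spherical hailstone at terminal fall speed,
    optionally combined with a horizontal downburst wind component."""
    mass = RHO_ICE * math.pi / 6 * diameter_m ** 3          # sphere mass, kg
    # Terminal speed squared: drag balances gravity for a falling sphere
    v_fall_sq = 4 * G * diameter_m * RHO_ICE / (3 * CD * RHO_AIR)
    return 0.5 * mass * (v_fall_sq + wind_ms ** 2)

for d_cm in (2, 5):
    calm = impact_energy(d_cm / 100)
    windy = impact_energy(d_cm / 100, wind_ms=30)           # ~110 km/h downburst
    print(f"{d_cm} cm stone: {calm:.0f} J calm, {windy:.0f} J in a downburst")
```

Because mass scales with the cube of diameter and fall speed with its square root, impact energy grows roughly with the fourth power of diameter; adding a horizontal wind component on top of the fall speed raises it further, which is why hail combined with downburst winds is singled out as extreme.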

In September 2018, for example, a tornado outbreak in the National Capital Region caused catastrophic damage resulting in over $300 million in insured losses. It is also the latest in the year that a tornado outbreak with up to EF3 damage has been recorded in Canada.

In June 2020, Calgary experienced Canada’s first billion-dollar hailstorm and fourth costliest natural disaster on record, with insured losses of $1.3 billion. The derecho in May 2022 that mainly affected southern Ontario took 12 lives, with early estimates of insured losses close to $900 million. And that’s just over the last four years.

How can we detect these trends?

Such events and their impacts cannot be adequately assessed and documented using standard operational weather observation platforms such as radar and surface weather stations.

Tornado tracks and hailswaths are inherently narrow and often pass between stations. Radar can capture some of the key meteorology, but not the impacts on the ground.

Comprehensive storm surveys by weather and engineering experts are required to fully assess and document the meteorology and its physical impacts through what we call an “event-based approach”. In fact, we recently added a social science component to such investigations to better capture the impacts on people and communities. The living database that results from these storm surveys can always be updated as new information is discovered.

A map shows the starting locations and tracks of the 23 tornadoes that occurred during a two-day tornado outbreak in Québec in June 2017. (Lesley Elliott and Liz Sutherland/The Northern Tornadoes Project), Author provided

This approach allowed the Northern Tornadoes Project to uncover one of the largest recorded tornado outbreaks in Canadian history — 23 tornadoes over two days in Québec — and increase the number of tornadoes documented across Canada each year. It has also allowed the new Northern Hail Project to recover and document Canada’s largest hailstone on Aug. 1, 2022.

The greater the length and better the quality of a national database of these events, the more likely it is that any severe and extreme storm trends will be detected.

Some progress has been made

The tornado data for southern Ontario is of sufficient length and quality to allow us to begin to look for trends. A 2022 study found that the annual number of tornadoes recorded there since 1875 has grown substantially. But that is mainly due to an increase in weak tornadoes — ones that might have gone unreported in the past but rarely escape the attention of an expanding population with consumer-grade cameras at the ready and access to social media for sharing.

The same study found, however, that tornadoes rated F/EF2+ in southern Ontario have occurred gradually later in the year since 1875, and now peak in late summer rather than early summer.

Meanwhile, in the U.S., studies have shown that tornadoes may be occurring in bigger clusters and starting to shift eastward – away from the Great Plains and into more populated areas.

In all cases, clear connections to human-caused climate change have not yet been established. Nor is it yet known whether extreme storms are changing in ways that differ from severe storms. But it’s still early, and research in this area is growing rapidly.

While storm trends are studied, prepare for increased impacts

Canadians are recording and sharing images and experiences of severe and extreme storms more than ever before, increasing the documentation of these events. As the population continues to grow and spread out, the damage and losses caused by thunderstorms will continue to grow.

Damage from an EF2 tornado in Barrie, Ont. on July 15, 2021. (Northern Tornadoes Project), Author provided

At the same time, we are learning more about changing storm patterns and possible connections to climate change. Continuing to increase the length and quality of our national severe and extreme storm event database is needed to better understand such changes.

In the meantime, developing adaptation strategies to ensure resiliency and to lessen the impact of inevitable damaging storms is becoming increasingly important. Improving upon building codes and other policies to promote more resilient buildings and communities is urgently needed to better protect the lives and property of Canadians.

Gregory Kopp, Professor of Civil Engineering & ImpactWX Chair of Severe Storms Engineering, Western University; David Sills, Executive Director – Northern Tornadoes Project, Western University, and Julian Brimelow, Executive Director Northern Hail Project, Western University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

What is a climate stress test? A sustainable finance expert explains

Imagine this: You take out a mortgage to purchase your dream home. But the rate you were quoted has expired, and when you go to renew it you find there’s been a major hike in interest rates. With this new rate, you are no longer able to afford your monthly payments.

How do you avoid this nightmare situation? The answer is a stress test.

In the simplest terms, a stress test helps individuals and institutions mitigate risk and make better decisions by playing out big economic shocks — like a major jump in interest rates or a global pandemic — to ensure they have what it takes to weather the storm.

A stress test is a “what if” exercise, where we contemplate scenarios that would pose the most harm to our financial systems and well-being in order to determine how we can best manage through them. They’re now being increasingly applied to future climate change and the financial risks that come with it.
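The mortgage example above can be made concrete with a few lines of arithmetic. The sketch below is a hypothetical stress test on a single fixed-rate mortgage: the loan size, rates and amortization period are illustrative assumptions, not data from any lender or regulator.

```python
def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate amortization payment."""
    r = annual_rate / 12                      # monthly interest rate
    n = years * 12                            # total number of payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

principal = 500_000                           # illustrative loan size
baseline = monthly_payment(principal, 0.03, 25)   # rate quoted at purchase
shocked = monthly_payment(principal, 0.06, 25)    # "what if" rate hike

print(f"baseline payment: ${baseline:,.0f}/month")
print(f"shocked payment:  ${shocked:,.0f}/month")
print(f"increase: {shocked / baseline - 1:.0%}")
```

Under this toy shock the monthly payment rises by roughly a third — exactly the kind of “can the borrower still pay?” question a stress test is designed to surface before the shock actually arrives.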

Physical risks, transition risks

The 2008 financial crisis put the need for better risk planning into sharp relief, especially for financial institutions. It’s no coincidence that we have seen a steady rise in the use of this tool since that time.

Today, financial regulators, banks and policy-makers use stress tests to uncover weak points in how financial institutions operate and identify changes that will help buffer them (and our larger financial system and everyone who depends on it) from harm.

Climate action failure, extreme weather events and biodiversity loss are the top three global risks over the next 10 years, according to the World Economic Forum’s Global Risks Perception Survey. (World Economic Forum Global Risks Report 2022)

So, what’s a climate stress test? It is the same what-if exercise, conducted through the lens of different climate scenarios that have diverse and significant financial consequences.

On the one hand, there are physical climate risks. Think, for example, of extreme weather events, such as floods, droughts, ice storms or heat waves, that can damage property, disrupt supply chains, increase insurance costs and shut down operations. In scenarios where global temperatures rise higher, the physical risks increase.

On the other hand, there are also transition risks. This refers to the material impacts of various degrees of climate ambition and action.

For example, new or more stringent government policies aimed at cutting carbon emissions further or faster will have different financial impacts on different companies and sectors, depending on their climate-readiness.

Scenarios aren’t predictions

Climate scenarios take both types of risk into consideration, physical and transition. Like other types of stress tests, these scenarios aren’t predictions. Imagining what would happen if interest rates skyrocket isn’t the same as predicting that they will.

However, given the established scientific consensus that climate change risks are increasing and the high degree of uncertainty these risks create, climate stress tests are an important tool to assess the sustainability of companies, investments and our financial system overall. And there is increasing momentum behind this practice.

For example, the Office of the Superintendent of Financial Institutions (OSFI) and the Bank of Canada recently released a major report examining four climate scenarios over a 30-year horizon, from 2020 to 2050, that varied in the ambition and timing of climate policy and in the pace of global change:

  • Baseline scenario: global climate policies in place at the end of 2019.
  • Below 2 C immediate: immediate policy action toward limiting average global warming to below 2 C.
  • Below 2 C delayed: delayed policy action toward limiting average global warming to below 2 C.
  • Net-zero 2050 (1.5 C): a more ambitious immediate policy action scenario to limit average global warming to 1.5 C, including current net-zero commitments by some countries.

Physical risks dominate

The results of the analyses were clear.

First, delayed action will lead to higher economic shocks and risks to financial stability. The longer we wait to act, the more drastic and sudden those actions will be.

Second, while every sector will need to contribute to the transition, the analysis showed that “significant negative financial impacts emerged for some sectors (e.g., fossil fuels) and benefits emerged for others (e.g., electricity).”

Third, macroeconomic risks are present, particularly for carbon intensive commodity exporting countries like Canada.

The European Central Bank also conducted a climate stress test with similar findings. It determined that climate change represents a systemic risk — especially for portfolios in specific economic sectors and geographical areas. For example, in the mining and agriculture sectors, or in oil-dependent regions like the Gulf States.

It also found that physical risks will be more prominent in the long run than transition risks. The physical risks of climate change to real estate in coastal regions or to supply chains are expected to be greater than the effects of changes in carbon pricing or other policies.

These findings have clear implications for companies and investors. Now more than ever the business case for prioritizing and evaluating corporate climate resilience is clear, especially as investors and lenders increasingly incorporate climate data into their financial decisions.

For example, it is now more broadly understood how climate policy changes could abruptly impact a company’s valuation and financial outlook. This makes climate policy foresight critical, for corporate leaders and investors alike.

As climate stress tests become increasingly common, their findings and implications will reverberate across the entire financial industry. Savvy leaders will both watch this conversation closely, and take the necessary steps to adapt and thrive.

Ryan Riordan, Professor & Distinguished Professor of Finance, Research Director at the Institute for Sustainable Finance, Queen’s University, Ontario

This article is republished from The Conversation under a Creative Commons license. Read the original article.

We can’t predict the next wildfire disaster – but we can plan for it

Jen Beverly, University of Alberta

Intense, fast-spreading fires are an enduring and natural feature of Canadian landscapes, but for most of the past 40 years, relatively few residents were evacuated each year. Yet, in the past 10 years, an unprecedented number of homes have burned in Alberta and British Columbia.

Recently, a wildfire destroyed 90 per cent of Lytton, B.C. Residents had minutes to evacuate as the fire engulfed the village. Slave Lake and Fort McMurray have also suffered enormous losses within the past decade.

As a wildfire scientist, when I look at these disasters I don’t see isolated events, or even a trend, but an abrupt shift to a completely new state. Since 2011, Western Canada has experienced a succession of extreme fire seasons with prolonged threats that affect many communities and last weeks or months.

When I think about what unfolded in Lytton and elsewhere, I am reminded of American business magnate Warren Buffett’s advice on the need to prepare for adversity: “Predicting rain doesn’t count. Building arks does.” For me, this means that efforts to predict fire risk and to prioritize mitigation efforts are not enough. Now is the time to prepare for fire disasters — wherever they are possible — and to start deciding what we will do when they happen.

Evacuations were infrequent, untracked

Twenty years ago, there were no national statistics on wildfire evacuations. The 2003 Okanagan Mountain Park fire, which consumed 239 homes in Kelowna, B.C., first exposed how little we knew about the problem. Was it an isolated anomaly or a harbinger of what was to come?

In the years that followed, my colleagues and I began to compile details from newspaper archives and records from emergency response agencies gathered from 1980 to 2007. Overall, evacuations had displaced a relatively small number of Canadians. In more than 25 years, wildfires destroyed 497 homes and prompted evacuation of just 210,000 people, the equivalent of about 18 homes and 7,500 evacuees annually. We confirmed only one civilian fatality.

That compares with roughly 3,000 homes lost in Slave Lake in 2011 and Fort McMurray in 2016. Fort McMurray also had 80,000 evacuees in 2016 and B.C. had 65,000 evacuees the following year. In Alberta, 15,000 were evacuated during the spring of 2019 alone.

Analyses of national fire numbers and area burned have revealed statistically significant increasing trends in large parts of Western Canada. Nationally, the largest fires have doubled in size since 1959. We also know that fire seasons are getting longer, with more days conducive to the types of fast-spreading, intense fires that can threaten public safety and property.

In recent decades, there has been a surge of research studies that seek to predict how fire regimes — fire frequency, size, intensity, severity and season — can be expected to change in concert with our heating climate. Those studies certainly point to intensification of the kinds of weather extremes that produce wildfire disasters like the recent one in Lytton.

Possible catastrophes need action

Prediction has long been a cornerstone of fire research and fire management. We study the data and build complex models to identify which areas are most likely to burn today, tomorrow, this year and in the years to come. This information can help decision makers prioritize limited fire suppression resources and mitigation budgets, such as those allocated for FireSmart fuel reduction treatments.

Early in my career, I used complex computer simulation models to try to map the locations most likely to burn over the coming years. But when we looked at where real fires occurred in the years that followed, we discovered that most fires consumed areas assessed as having a relatively low likelihood of burning.

No matter how sophisticated, fire risk assessments are riddled with uncertainties and crippled by the inherent variability and randomness — referred to as stochasticity — of the fires, weather and fuels at play.

Governments can prioritize the most at-risk communities in a region and allocate mitigation funds to the top 20, but the next disaster could very well hit community No. 21. When conditions are extreme, like the 60 km/h winds reported in Lytton, FireSmart fuel reduction treatments cannot be relied upon to protect a community from an encroaching fire.

The evacuation records taught us that these events often unfold under highly atypical conditions such as extreme wind speeds that would be ignored in risk assessments based on what is most likely. In short, if it’s possible for an area to burn at all, then you need to plan for it.

Take what you know and plan what you’ll do

So what do we know for certain? Fuels are the hazard, or precondition, necessary for fire, and we know where the fuels are. In this context, fuels are live and dead biomass or vegetation. We can map the fuel hazard and identify which locations in a community or landscape are exposed to potential ignitions.

This simple approach led to the creation of the FireSmart Exposure Assessment tool for informing community protection planning, and we’ve recently shown that it works for assessing large landscapes too.
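The exposure idea can be sketched in a few lines. This is a hypothetical illustration of the general approach — scoring each cell by the fraction of nearby cells holding hazardous fuel — not the actual FireSmart Exposure Assessment tool; the grid and search radius are invented for the example.

```python
# Toy fuel map: 0 = non-fuel (water, cleared land), 1 = hazardous fuel
FUEL = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 1],
]

def exposure(grid, row, col, radius=1):
    """Fraction of cells within `radius` of (row, col), excluding the cell
    itself, that contain hazardous fuel."""
    total = fuel = 0
    for r in range(max(0, row - radius), min(len(grid), row + radius + 1)):
        for c in range(max(0, col - radius), min(len(grid[0]), col + radius + 1)):
            if (r, c) == (row, col):
                continue
            total += 1
            fuel += grid[r][c]
    return fuel / total

print(exposure(FUEL, 0, 0))   # corner cell ringed by fuel -> 1.0
print(exposure(FUEL, 2, 1))   # interior cell with little nearby fuel -> 0.25
```

On this toy grid, the corner cell surrounded by fuel scores 1.0 while an interior cell with little nearby fuel scores 0.25; a real assessment would use mapped vegetation and an ember-transport distance in place of the one-cell radius.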

My research team is currently extending that work to map potential fire pathways into communities, and in collaboration with transportation engineer Amy Kim and her students, we’re asking how the flow of fire into a community could disrupt the flow of people evacuating the area.

Our aim is to develop simple and easily computed metrics of fire exposure, fire pathways and evacuation routes to inform what-if scenarios. Agencies and communities can use these to understand vulnerabilities and develop proactive strategies for mitigation, response, containment and evacuation.

Science can inform the planning process, but ultimately these efforts will only succeed when solutions are developed locally to capture local circumstances, knowledge and needs. Rather than a burden, planning for fire can be a mechanism for growing local skills and long lasting community connections, by bringing diverse perspectives together around the common goal of a safer and more resilient future.

When it comes to wildfire threats to communities, we are navigating uncharted waters. Under extreme conditions like those across B.C. this summer, we cannot stop a spreading wildfire. When one occurs, the only options are containment or evacuation. So start planning your route now.

Jen Beverly, Assistant Professor, Wildland Fire, University of Alberta

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Climate change and extreme weather

Increasingly, the climate change debate is part of the dialogue surrounding catastrophic weather events, including the floods, fires and severe weather impacting Canada. Emerging research is focusing on how climate change may be affecting extreme weather events, and the insurance industry is grappling with how to respond to a changing landscape of risk. Set against a backdrop of normal climate variability driven by natural cycles, the impacts are further complicated by the interactions between the ocean and atmosphere, and between the Arctic and the mid-latitudes. These teleconnections make it difficult to separate the signal from the noise. The biggest question of all is whether the future will resemble the past.

An interesting starting point for all weather is the Arctic, and scientists are beginning to focus on the connections between the Arctic and weather in the middle latitudes, especially in North America. The term “arctic amplification” refers to the observation that the Arctic is warming faster than anywhere else on Earth, and that current warming accelerates future warming in a positive feedback loop. Scientists often cite the shrinking extent of Arctic sea ice as the primary driver of the warming. As sea ice extent shrinks, the ocean surface is exposed to the air, which has several implications. A white ice surface reflects incoming solar radiation back to space, keeping the surface colder. Conversely, a darker ocean surface absorbs incoming solar radiation, warming the ocean surface and the lower levels of the atmosphere. A warmer atmosphere can in turn hold more water vapour, itself an important greenhouse gas. So as Arctic sea ice extent shrinks, the Arctic atmosphere warms ever more rapidly through these complementary processes.
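The reflectivity argument above comes down to a one-line energy balance: a surface absorbs the incoming solar flux minus the fraction it reflects. The sketch below compares bright sea ice with dark open water; the albedo and insolation values are rough, illustrative numbers, not measurements.

```python
def absorbed_flux(insolation_wm2, albedo):
    """Solar flux absorbed by a surface: incoming minus the reflected fraction."""
    return insolation_wm2 * (1 - albedo)

INSOLATION = 200.0                               # illustrative Arctic summer-average flux, W/m^2
ice = absorbed_flux(INSOLATION, albedo=0.6)      # bright sea ice reflects most sunlight
ocean = absorbed_flux(INSOLATION, albedo=0.06)   # dark open water reflects very little

print(f"ice surface absorbs: {ice:.0f} W/m^2")
print(f"open ocean absorbs:  {ocean:.0f} W/m^2 ({ocean / ice:.1f}x more)")
```

With these rough numbers, open ocean absorbs more than twice the solar energy of the ice it replaced, which is the heart of the ice-albedo feedback.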

Figure 1:  Arctic (land stations north of 60°N) and global mean annual land surface air temperature (SAT) anomalies (in °C) for the period 1900-2015 relative to the 1981-2010 mean value (CRUTEM4 dataset)

Observations certainly support the theory that arctic amplification is producing greater warming in the Arctic than in the global average. The Arctic has warmed by roughly 3 C since the beginning of the 20th century, and seasonal anomalies have been profound. Arctic sea ice extent reaches its annual minimum at the end of September each year, when the onset of the Arctic night triggers the freeze-up of the sea. For the past decade or so, however, this freeze-up has been delayed by anomalously warm temperatures in the late fall and early winter. In November 2016, temperatures were 20 C warmer than the daily average, coinciding with record low Arctic sea ice coverage.

Figure 2:  Daily temperature anomalies on Nov. 17, 2016, when record high temperatures coincided with record low sea ice extent

Research indicates that another positive feedback cycle is emerging; namely that when the Arctic is very warm, it is leading to the jet stream taking wavier paths—big northward swings and southward dips. The jet stream is driven by temperature differences between the poles and the tropics.  Scientists have observed that the reduced temperature difference between the North Pole and tropics is associated with slower west-to-east jet stream movement and a greater north-south dip in its path. The poleward branch of this pronounced jet stream transports heat and moisture from deeper in the tropics north into the Arctic, which heats the Arctic more.

Figure 3:  Schematic of a typical 1980s sea ice extent and jet stream pattern (purple), contrasted with current Arctic sea ice extent (white) and today’s jet stream pattern (reds and blues) (SSEC at University of Wisconsin)

The real question is: how does this warming of the Arctic affect weather in the middle latitudes, including North America? The answer may lie in the physics and dynamics of a slower, wavier jet stream. This pattern causes a “stickiness” in the typical progression of storms: rather than moving along quickly, systems stall and intensify, resulting in more extreme weather, including floods, large hail and severe droughts that accelerate wildfires.

This type of “sticky” jet stream pattern was in place during the onset of the Fort McMurray wildfire. A dome of persistent high pressure sat over northern Alberta for several weeks preceding the event, causing abnormally warm temperatures and a lack of precipitation that directly contributed ample fuel for a large and damaging wildfire. On the flip side, when a region is stuck in the rainy part of the jet stream, extreme precipitation is possible. A 2017 study from the Prairie Climate Centre projected mid-century spring precipitation to be 40-50 per cent higher than the baseline in some locations in southern Alberta and Saskatchewan, raising the potential for massive flash flooding along the many rivers of the Prairie provinces.

While observations and theory suggest that arctic amplification is already occurring and will be difficult to reverse, the actual impacts on communities in the northern hemisphere remain highly uncertain. As outlined above, an active research community is endeavouring to shed light on the potential impacts, including extreme weather with the potential to change the landscape of risk in Canada. Our existing tools rely heavily on the idea that the future will resemble the past; it is therefore imperative that the insurance industry continue to develop innovative approaches to managing risk, and partner with the academic community to better understand the problem and customize solutions.


Future hail and severe weather environment

Three model pairings (HadCM3-MM5, HadCM3-HRM3 and CCSM3-MM5) from the North American Regional Climate Change Assessment Program (NARCCAP) (Mearns et al. 2012) were used to assess the future (2041-2070) hail and severe weather climate west of the continental divide under the SRES A2 emission scenario. In addition, for the first time, a hail model (HAILCAST; Brimelow et al. 2002) was run using the NARCCAP models to explicitly project future changes in hail characteristics.

According to Brimelow et al. (2017), by mid-century we may see an overall decrease in the number of severe weather days in southern Saskatchewan and Manitoba in summer (with no change in spring), though when severe weather does occur, it could potentially be more severe (with respect to wind and tornadoes). Changes in hail, however, are less clear-cut, because rising freezing levels tend to melt hail more readily.

The decrease in future severe weather days in southern Saskatchewan and Manitoba may be due to increased capping (i.e. warm air above the ground that inhibits storm formation). Much of Alberta may experience an increase in damaging hail in the summer, when storms do occur.

There does not appear to be much future change in severe hail days over southern Ontario; however, when hail does occur, it may potentially be larger (in spring, not summer), partly because higher freezing levels melt smaller hail before it reaches the ground, leaving mainly larger stones to survive. In addition to larger hail, southern Ontario may also experience an earlier occurrence of large hail in the spring period.

Over all regions, the common ingredient that creates conditions for more intense storms is an overall increase in atmospheric moisture, driven by future warming, which increases storm energy (when storms do occur). These results are broadly consistent with other U.S. research (e.g. Allen et al. 2015; Trapp et al. 2009; Van Klooster and Roebber, 2009), although the summer jet stream affecting southern Canada may not weaken as substantially as it does over the U.S. This is partly why parts of the southern Canadian Prairies may not see decreased severe weather potential in summer.


Allen, J. T., Tippett, M. K. & Sobel, A. H., 2015:  An empirical model relating US monthly hail occurrence to large-scale meteorological environment. J. Adv. Model. Earth Syst. 7, 226–243.

Brimelow, J. C., Reuter, G. W. & Poolman, E. R., 2002: Modeling maximum hail size in Alberta thunderstorms. Weath. Forecast. 17, 1048–1062.

Brimelow, J.C., W.R. Burrows and J.M. Hanesiak, 2017: The changing hail threat over North America in response to anthropogenic climate change. Nat. Clim. Change, DOI: 10.1038/NCLIMATE3321.

Mearns, L. O. et al. 2012: The North American regional climate change assessment program: overview of phase I results. Bull. Am. Meteorol. Soc. 93, 1337–1362.

Trapp, R. J., Diffenbaugh, N. S. & Gluhovsky, A., 2009: Transient response of severe thunderstorm forcing to elevated greenhouse gas concentrations. Geophys. Res. Lett. 36, L01703.

Van Klooster, S. L. & Roebber, P. J., 2009: Surface-based convective potential in the contiguous United States in a business-as-usual future climate. J. Clim. 22, 3317–3330.


Author: Glenn McGillivray, Managing Director, Institute for Catastrophic Loss Reduction (ICLR)

On a recent long haul flight I finally broke down and watched ‘Only the Brave’, the 2017 Josh Brolin movie about the 19 wildland firefighters killed at Yarnell Hill, Arizona in June, 2013.

Up to that point, I had refused to watch the movie, thinking that it would likely romanticize wildland firefighting and demonize wildland fire.

I refused to watch the movie like I refuse to call the Fort McMurray wildfire ‘The Beast’, an overly romantic moniker coined by the now retired fire chief of that city who gave the fire the qualities of an evil, soulless creature. I didn’t (and still don’t) see the benefits of anthropomorphizing the fire, making it seem like a rational, calculating, punitive creature. In my view, it helps no one to imply that such a fire is some kind of intentional being with a mind of its own. We won’t work to prevent such an event from reoccurring with such a mindset.

I remain dedicated to not calling the Fort McMurray fire that name, though I admit I was largely wrong about the movie. It is a pretty good flick, though there is one part where the fire superintendent (played by Brolin) looks over the expanse of scrub in his protection zone and says something to the effect that he and his crew “protect all of this.”

The idea of ‘protecting’ a forest against fire is largely the wrong stance to take (especially in Canada’s boreal forest, which needs fire for its own good). It is this ‘suppression at all costs’ mentality that has gotten many North American jurisdictions into the mess they are currently in, i.e. where years of successful suppression have ensured that wildlands are choked with fuel that’s now just waiting to go up like tinder. In large measure, saying we need to stop fire on the landscape is akin to saying we have to stop the wind or the rain.

But I don’t wish to spend my time here talking about the issue of suppression. I deal with that here.

Instead, I want to put forth an idea of how we can better understand the interface fire problem (i.e. the issue of fire getting into communities), at least partly by looking at what we’ve learned from the past.

In the distant past, several major cities, mostly in Western Europe and North America, experienced large conflagrations caused by one thing or another (like rambunctious cows). Fires in such places as London, New York, Toronto, Chicago and San Francisco led to many changes in how cities are designed, how buildings are constructed, and in fire education and safety.

I suspect that these fires were largely viewed in technical terms and, thus, were seen as addressable, where measures could be put into place to prevent or, at the very least, reduce the risk of reoccurrences.

Firewalls were placed within and between buildings; openings (like small windows) were limited on the exposure sides of buildings; fire doors became common; buildings were outfitted with fire alarms, suppression equipment with dedicated water supplies and, from the late 19th century, sprinkler systems; less wood was used in construction; open flames were limited, and so on. Parallel to these efforts came the rise of education programs to inform people about the risk of fire and actions they could take to limit ignitions and spread. Over time, both the frequency and severity of urban fires dropped precipitously, to the point where fires are no longer a major cause of death or the main cause of insured property damage in most industrialized countries.

These actions are essentially early examples of risk management and are largely still in practice today. Indeed, it is still common for the risk manager of, say, a factory or mill to do a walk around of a site and make recommendations about how to prevent ignition and spread of fire.

But we don’t take this approach with homes in the interface. Why?

First, wildfires are viewed as ‘natural disasters’, and there is a widespread view that “nothing can be done about natural disasters” – they occur at the whim of Mother Nature. Really, though, a wildfire is a natural hazard; the disaster comes when the hazard exploits manmade vulnerabilities. I think the view that losses are inevitable when a hazard strikes is leading to inaction when it comes to wildland fire. For some reason, we treat the prevention of interface fires differently than we treat the prevention of other fires. But fire is fire.

Second, people have a misconception about wildfires and the interface, believing that wildland fires roll through the forest, hit a built up area and keep rolling. But what largely happens is that embers from the wildfire are blown far ahead of the fire front and ignite flammable materials located around structures. These materials then either ignite the structure directly, or ignite something else (like a wood shed or deck) that in turn ignites the structure. This is what largely occurred in Fort McMurray. It is also what occurred in the Tubbs Fire in Northern California in October 2017, except the embers travelled very deeply into the urban core of Santa Rosa, leading to the incineration of about 2,800 homes, mostly in the Low Risk part of town (as designated by the state’s statutory wildfire risk maps). These maps apparently did not take the region’s often intense Diablo winds into consideration.

Once you realize that wildfires are not juggernauts that roll through town like a steamroller and that structural ignitions from wildfire embers are preventable, then you can put programs into place to address the issue of flammability of individual structures, subdivisions and entire communities located in the interface.

One problem I see is that we may be talking too much to the wrong folks: to wildland fire experts, and not enough to structural fire experts, fire modellers and other urban fire experts.

Now don’t get me wrong. Wildland fire experts, including fire ecologists and wildland fire suppression experts, are key throughout the entire lifecycle of a wildland fire – (long) before, during and (long) after. And we need to recognize that the condition and health of the forest around the interface community will largely dictate how intense the fire will be, the rate at which it spreads, and the amount of embers that are produced (the greater the fine fuels, the more embers).

But once a wildland fire gets into town, the fire stops being a forest fire and starts a new life as an urban fire, possibly becoming an urban conflagration or ‘firestorm’ if enough structures are ignited (often via structure to structure spread of fire).

So we have to recognize that once the fire hits town, it becomes a different fire, feeding on different fuels (like structures and vehicles). A fire ecologist, for example, has no expertise in the mechanisms that lead to structural ignition and spread of fire in an urban setting.

Thus, we need to bring structural or urban fire departments and experts into the discussion and leverage their knowledge (of course, many are already involved in the discussion, but many are not).

We have to pull in such organizations as the Canadian Association of Fire Chiefs, the Aboriginal Firefighters Association of Canada and their provincial counterparts, as well as provincial firefighter associations.

We need to bring in researchers such as fire modellers, to better understand how fire takes hold and spreads in urban areas (we know what causes structures to ignite, but we need to do more to understand how entire subdivisions are lost) and the sequence of such spread. Some work has already been done in the fire-following-earthquake research area, and many of the lessons learned there can be carried over to wildland-urban interface fire research.

Essentially, we need to take the same approach with wildland fire in interface communities as we do with all other urban fires, including urban conflagrations.

This can only start by talking to the right people.


Article written for the Institute for Catastrophic Loss Reduction’s (ICLR) Cat Tales, January/February 2018:

On February 15, ICLR released Hail climatology for Canada: An update. The report was written by David Etkin, Associate Professor of Disaster Management at York University.

The paper serves as an update to Etkin’s Canada’s Hail Climatology: 1977-1993, prepared for ICLR in April 2001. The update is based on an objective analysis of hail observation station data from 1977 to 2007.

National hail climatologies (i.e. the number of hail days per year in Canada) serve as a foundation for hail risk analyses. Although national hail climatologies cannot be used to determine hailstorm severity or to infer damage, they are used to help identify vulnerable regions, and thus areas where mitigation efforts should be concentrated.

Hail days data for the analysis were obtained from the Digital Archive of Canadian Climatological Data, Environment Canada, from all hail observing stations in the country. For each station, monthly days-with-hail were calculated where the number of missing observations was fewer than four days in any month. This represents 96.7% of the records. Monthly hail days were adjusted for missing data by multiplying the unadjusted hail-day observation by the factor [1 + (number of missing days) ÷ (number of days in the month)].
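As a minimal sketch of the adjustment step described above (the function name and the exclusion check are my own; only the scaling factor comes from the report):

```python
def adjusted_hail_days(observed: float, missing_days: int, days_in_month: int) -> float:
    """Scale up a monthly hail-day count for missing observations,
    using the report's stated factor [1 + missing / days_in_month]."""
    if missing_days >= 4:
        # Months with four or more missing days were excluded from the analysis.
        raise ValueError("month excluded: too many missing observations")
    return observed * (1 + missing_days / days_in_month)

# 3 observed hail days in a 30-day month with 2 missing observations:
# 3 * (1 + 2/30) = 3.2 adjusted hail days
```

Note that this factor slightly underestimates the fully proportional correction, days ÷ (days − missing), but the two agree closely when only a few observations are missing.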

A trend analysis showed no change in hail frequency for Ontario, in contrast to other studies that have examined severe hail frequency and tornado frequency. Alberta, by contrast, showed a significant increase in hail frequency during the period 1977 to 2007.

Manitoba and Saskatchewan showed decreasing trends. Future research could examine in more detail which areas exhibit increasing or decreasing hail frequencies, and how those trends correlate with larger-scale climate drivers.

Etkin warns that further hail research would be constrained by the lack of ongoing hail observations by Environment Canada. Hail observations at Environment Canada weather and climate stations were not widespread until 1977, he notes. After 1993 the number of hail observing stations began to decline and after 2005 the number of stations reporting hail dropped precipitously. After 2007, he reports, the number of observation stations was trivial. Other datasets would have to be used, such as those created by radar and satellite imagery.

In the 1990s and early 2000s, ICLR conducted a number of studies focused on understanding the risk of hail damage in Canada. The insurance industry’s need for hail research became acute before ICLR was established, when Canada’s most costly hailstorm struck Calgary in 1991. In particular, ICLR published an earlier hail climatology (1977-1993) and conducted several workshops where hail was considered as part of a broader discussion of convective storm-related losses.

Institute members also contributed to an industry discussion that led to the creation of the Alberta Severe Weather Management Society.

Fortunately, there were few large hail damage events in Canada between 1991 and 2008. Indeed, there was a period of almost ten years when the Institute received virtually no requests from member companies to study the peril. The industry directed ICLR to focus its research on other hazards, including the alarming increase in water damage. Indeed, hail research was not included in the Institute’s last five-year plan.

However, hail damage claims have ramped up in Canada in recent years. Just three wind/water/hail events in Alberta (2010, 2012 and 2014) totaled more than $1.66 billion in insured losses. As a result, in 2015 Canadian property and casualty insurers – through ICLR’s Insurance Advisory Committee – formally asked the Institute to investigate the peril and suggest actions insurers can take to mitigate future hail losses in the country.

Conducting an updated climatology of hail is key to understanding the current state-of-play for the hazard before more in-depth research is pursued.

Prior to joining York University, David Etkin worked for 28 years with the Meteorological Service of Canada in a variety of fields, including operations and research. He has been an associate member of the School of Graduate Studies at the University of Toronto since 1994, doing research on natural hazards, teaching and supervising graduate students. In 2003 he was awarded the Environment Canada Award of Excellence. Prof. Etkin has participated in three international hazard projects and was one of only two non-Americans to assist with the U.S. 2nd national assessment of natural hazards. He has been principal investigator for a NATO short term project on natural hazards and disasters and the Canadian Assessment of Natural Hazards Project that resulted in the book An Assessment of Natural Hazards and Disasters in Canada, which he edited. The summary report he wrote of this latter project has been widely distributed within Canada and was used by Public Safety Canada and Foreign Affairs Canada as the official Canadian contribution to the recent ISDR Kobe disaster conference. CT

Link to report:

The Hurdle of Replacement Cost Value (RCV) to Building Back Better


  • Emily Stock, lawyer, Monaghan Reain Lui Taylor LLP

When we discuss the concept of building back better, we all agree that it is great. Who can oppose making our communities, infrastructure and people more resilient to catastrophes? We also recognize that there are a multitude of hurdles to consider, and that underlying many of those hurdles is an often inflexible legal regime.

Understanding property insurance coverages is central to any policy of building back better. Typically, property insurance policies provide for some combination of Actual Cash Value and Replacement Cost Value, depending on a variety of criteria.

Actual Cash Value (ACV) is also sometimes referred to as market value. It is intended to be the dollar amount you could expect to receive for the item if you sold it in the marketplace. It thus takes into account depreciation of the property. An insurance company determines the depreciation based on a combination of objective criteria (a formula considering the category and age) and a subjective assessment of the marketplace. The result is that if a homeowner receives ACV they technically receive exactly what they lost (i.e. the value of an old house); however, they cannot afford to replace their property (i.e. build a new house).

Replacement Cost Value (RCV) is the cost to replace the property. It covers the depreciation of the property, so that the homeowner receives the cost to build a new house similar to the house they lost. When we talk about building back better, the homeowner can typically only afford to rebuild if they are able to obtain RCV.
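The gap between the two payouts is simple arithmetic. As a hedged illustration (the straight-line depreciation method, rate, and dollar figures below are hypothetical, not drawn from any actual policy):

```python
def actual_cash_value(replacement_cost: float,
                      annual_depreciation_rate: float,
                      age_years: float) -> float:
    """ACV sketch: replacement cost minus accumulated straight-line
    depreciation, floored at zero."""
    depreciation = min(replacement_cost,
                       replacement_cost * annual_depreciation_rate * age_years)
    return replacement_cost - depreciation

# Hypothetical example: a house costing $400,000 to rebuild,
# depreciated at 2% per year for 20 years (40% total).
rcv = 400_000.0
acv = actual_cash_value(rcv, 0.02, 20)  # 240,000.0
```

Under these assumed numbers, an RCV settlement pays the full $400,000 rebuild cost, while an ACV settlement pays only $240,000 — which is exactly why a homeowner hoping to build back better usually needs RCV.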

The difficulty in obtaining RCV is that it is typically only available where the rebuild is:

(a)        at the same site or location; and,

(b)        uses materials of “like kind and quality.”

But what if being at the same location, or building with like kind and quality, is not building back better? Consider if the insured event is a flood in an area that is now recognized as flood-prone. We don’t want to encourage that homeowner to rebuild in the same location, despite there being significant financial incentive for them to do so. Similarly, if the event is a fire, we know that we don’t want to require that the rebuild use the same non-fire-resistant materials (i.e. siding, roofing materials), or even rebuild the same type of structure, which may have been inappropriate for the location given the changing environment.

Although we recognize that the wording of the policy must define the rights of the homeowner, and the monetary obligation of the insurer, I expect we would also agree that it seems unfair to require the homeowner to rebuild at the same site, or use materials of like kind and quality, where such is contrary to the principles of building back better.

In considering this conundrum, it is helpful to consider the rationale for RCV, as recently articulated by the Ontario Court of Appeal in Carter v. Intact.  After recognizing that replacement cost insurance is justifiable even though it provides the policyholder with greater value than what they lost, the Court explained the following as to why the limitations on RCV are reasonable and required:

… allowing insureds to replace old with new raises a concern for the insurance industry. The concern is moral hazard: the possibility that insureds will intentionally destroy their property in order to profit from their insurance; or the possibility that insureds will be careless about preventing insured losses because they will be better off financially after a loss.

To put a brake on moral hazard, insurers will typically only offer replacement cost coverage if insureds actually repair or replace their damaged or destroyed property. If they do not, they will receive only the actual cash value of their insured property.

It is clear from this reasoning that there is little to no risk of moral hazard in catastrophic insured events. There is similarly no risk of moral hazard where an insured homeowner does not want to comply with one or both of the criteria to receive RCV – same location and/or like kind and quality – because of a desire to build back better.

As advocates for encouraging the insurance industry to build back better, we should therefore thoughtfully relax the two criteria. This could be done proactively by deleting the same-location requirement in policies in flood or other risk zones. The like kind and quality criterion should also be carefully examined for certain types of insured events (i.e. fires), so that more appropriate materials can be agreed upon before the event as acceptable under the policy.

An important step toward reducing these types of hurdles is to continue and deepen the conversation, which I look forward to doing at the CatIQ Canadian Catastrophe Conference in 2018, and beyond.


Improving Wildfire and Flood Risk Mitigation in Canada

Author: Alan Frith, CPCU, ARe, CEEM

Canadians have suffered an increasing number of natural and man-made disasters that have devastated communities and cost insurers more than CAD 5 billion in 2016 alone. With decentralized regulation and prevention efforts, the growing financial fallout is only likely to worsen. CatIQ’s Canadian Catastrophe Conference, January 31 – February 3, 2018, will bring together industry, academia, and government to discuss Canada’s natural and man-made catastrophes. I will be sitting on a panel discussing the viability of the Alberta property insurance market in light of recent catastrophic events.

For insurers, an accurate and objective view of risk, reinforced by an up-to-date insight into the exposure, is crucial to maintaining financial stability. Managing catastrophe risk through historical losses alone is unreliable and volatile. Organizations must seek out the most comprehensive information available to understand how rapidly changing environmental characteristics alter their risk profiles.

Let’s examine two of the perils that have recently caused significant losses in Canada to understand how risk assessment tools can be used to quantify the impact of these kinds of events.


High-resolution satellite imagery enables us to develop an in-depth understanding of land use/land cover and is a critical piece of the risk mitigation framework. Using this imagery, risk modelers can construct an accurate map of potential fuel sources, a fundamental requirement for effectively modeling wildfire spread. Dense or dry vegetation, for instance, allows a fire to spread easily, while roads, rivers, or even mountains serve as natural firebreaks. Depending on the local distribution of fuel sources, the interaction between different types of fuels, and the gradual evolution of the surrounding landscape, your portfolio’s exposure may very well outpace your organization’s risk appetite over time.

It’s also important to monitor the effect of population movement. Rapid residential and commercial/industrial growth deeper into areas of combustible fuel, the Wildland Urban Interface (WUI),  exposes properties to greater wildfire risk. For organizations underwriting property risk in Canada, one fire in recent memory certainly stands out from the rest, demonstrating well the dangers of increased development in the WUI.

In 2016, the Fort McMurray fire blazed through the surrounding areas of northeastern Alberta, causing unprecedented devastation and destroying up to 80% of homes in some neighborhoods. A hub of oil production in Canada, the town had experienced a 30% population boom between 2006 and 2011 as people moved to the area to take advantage of job opportunities at the many mines and oil sands refineries in the area. The increased presence of residential structures among dense forest helped the flames spread quickly and resulted in insured losses of nearly CAD 4 billion – making it the costliest natural disaster in Canadian history. Although the oil facilities and pipelines themselves avoided damage, oil companies suffered significant business interruption losses as firefighters struggled for months to contain and extinguish the fires.


Detailed satellite imagery, coupled with high-resolution elevation data in the form of digital terrain maps (DTMs), is also used to build floodplain maps, which play an important role in evaluating flood risk. Reliable model output for this peril is highly dependent on precise exposure locations, as small changes to a location’s elevation or proximity to floodplains can have a significant impact on potential losses. The development of risk mitigation strategies, such as strict building codes and zoning laws, can help prevent these losses, so long as these codes are followed and the laws are enforced. Government agencies must consider the effect that unchecked commercial development—and the associated infrastructure of roads, sidewalks, and parking lots—can have on the larger ecosystem, especially during a natural disaster. The consequences of unregulated urban sprawl were recently seen in Houston, Texas, during Hurricane Harvey: Severe flooding was caused in part by floodwater that had inadequate access to natural drainage, such as undeveloped prairie and marshland areas.

Although flood is by far the most frequent natural disaster in Canada, preventive efforts are largely decentralized. For residential structures, overland flood damage has only recently been included as a covered peril by private insurance. In many cases, it’s a common exclusion from homeowners’ policies, with catastrophic losses ultimately falling in taxpayers’ laps via government emergency funds. With multiple catastrophic flood events in the past seven years, each causing billions of dollars of losses, it’s imperative to mitigate risk through preventive measures. This includes building flood defense systems as well as encouraging homeowners in and around potential flood zones to purchase policies that protect them against flood loss.

What’s Next?

Governments, insurers, and homeowners need to work together to reduce losses caused by natural disasters. It’s encouraging to see mitigation efforts gaining momentum, but there is much more work to be done. While government can manage building codes and zoning laws, it’s also up to insurers to carefully evaluate their risk profiles and exposures to ensure that they continue to maintain adequate reserves in the event of a catastrophe.

A useful first step in risk assessment is a detailed hazard map. These maps indicate the associated risks for perils such as wildfire and inland flood for a range of return periods, such as 100, 250, and 500 years. The logical next step to advance your risk evaluation strategy is to use a probabilistic model. These models use advanced simulation methods to provide valuable metrics such as Average Annual Loss (AAL), Tail Value at Risk (TVaR), and a full Exceedance Probability (EP) curve, from the portfolio level down to individual exposures.
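To make these metrics concrete, here is a minimal sketch of how they can be derived from a catalogue of simulated annual losses (function and variable names are mine; production catastrophe models are far more sophisticated, but the summary statistics reduce to the empirical calculation below):

```python
import numpy as np

def ep_metrics(annual_losses, return_periods=(100, 250, 500)):
    """From simulated annual losses, compute:
    - AAL: the mean annual loss,
    - EP points: the loss exceeded roughly once per return period
      (the EP curve sampled at those periods),
    - TVaR: the average loss given that the return-period level is reached."""
    losses = np.sort(np.asarray(annual_losses, dtype=float))[::-1]  # largest first
    n = losses.size
    aal = losses.mean()
    ep, tvar = {}, {}
    for rp in return_periods:
        k = max(1, int(round(n / rp)))  # ~number of simulated years at/above this level
        ep[rp] = losses[k - 1]          # loss exceeded about once every rp years
        tvar[rp] = losses[:k].mean()    # mean of the tail beyond that point
    return aal, ep, tvar
```

For example, over 1,000 simulated years, the 100-year EP point is roughly the 10th-largest annual loss, and the 100-year TVaR is the average of the ten largest — which is why TVaR always sits at or above the corresponding EP point.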

The Geospatial Analytics Module in AIR’s Touchstone® platform lets you seamlessly integrate exposure information with location-specific hazard maps, including wildfire fuel layers and flood inundation footprints for Canada. In addition, AIR offers probabilistic models for earthquake, crop hail, severe thunderstorm, winter storm, and tropical cyclone in Canada; models for wildfire and flood are being developed. As the insurance industry in Canada continues to cope with recent losses, the development of risk maps and probabilistic models as well as a deeper understanding of underlying hazards can help provide more effective risk management.

AIR models give you the information you need to support your entire risk-decision chain. Learn about the AIR Model Advantage!

About AIR Worldwide

AIR Worldwide (AIR) provides risk modeling solutions that make individuals, businesses, and society more resilient to extreme events. In 1987, AIR Worldwide founded the catastrophe modeling industry and today models the risk from natural catastrophes, terrorism, pandemics, casualty catastrophes, and cyber attacks, globally. Insurance, reinsurance, financial, corporate, and government clients rely on AIR’s advanced science, software, and consulting services for catastrophe risk management, insurance-linked securities, site-specific engineering analyses, and agricultural risk management. AIR Worldwide, a Verisk (Nasdaq:VRSK) business, is headquartered in Boston with additional offices in North America, Europe, and Asia. For more information, please visit