Proposed Session NH038: “Wildfires: Pushing the Limits of Predictability with Big Data, Big Models, and Machine Learning”
Societal and economic risks of wildfires have become more concerning in recent years, especially in densely populated areas and areas with industrial operations. For example, the 2017 California wildfire season cost billions of dollars and caused 43 deaths, and the 2016 Fort McMurray wildfire curtailed Canadian oil production for two months, affecting global oil markets.
Wildfires remain one of the least predictable perils, owing both to their aleatory uncertainty component, mainly associated with wildfire ignition triggers, and to their epistemic uncertainty component, reflecting the lack of knowledge about fire fuel availability, physical setting, and weather. Over the past decades, many novel data sources, models, and predictive techniques have become available, including satellite imagery, reanalysis datasets, weather forecasting models, social media data, and artificial intelligence methods. This session invites contributions demonstrating the use of “big data”, “big models”, and “machine learning” to reduce uncertainties in forecasting wildfire occurrence, severity, and temporal and spatial patterns.