Fire scientists at Los Alamos National Laboratory (LANL) and their partners have developed models that can help firefighters understand fire conditions, behaviors, and likely outcomes in terms of both risk and hazard in real time.
These models are extremely complex, as they must amalgamate statistical, empirical, and physical information from numerous sources to provide accurate, or close-to-accurate, simulations of many types of wildfire across a range of scenarios, terrain, and weather conditions. Understanding and predicting wildfire behavior is a particularly difficult scientific problem, as the relevant phenomena range in scale from individual flame sheets to topographically influenced atmospheric dynamics. Wildfires are driven by complex processes, from the combustion of natural fuel sources to local meteorology, and their behavior depends heavily on the coupling between these and other variables.
We are working with these scientists to develop better visualization methods for LANL’s FIRETEC model simulations. These visualization methods focus on allowing scientists to see the many intersecting, interdependent variables that overlap and occupy the same three-dimensional digital space, and how they interact over time.
These variables include both volumes and surfaces: smoke, vegetation density, air humidity, soil moisture content, tree height, air temperature, fire temperature, and wind direction and velocity. Below, we have experimented with “natural color,” or colors drawn from the specific environments susceptible to high wildfire hazard and risk, as well as varying levels of transparency and texture, to improve these visualizations.
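As a rough illustration of this idea (not the actual FIRETEC visualization pipeline), the sketch below overlays two synthetic, co-located fields, a fire-temperature core and a smoke plume, on a single slice, using natural-color ramps (ember orange, ash grey, a vegetation-green backdrop) that fade to full transparency where a variable vanishes. The field names, grid, and colormap endpoints are illustrative assumptions.

```python
# Minimal sketch: overlapping simulated fields rendered with "natural" colors
# and per-field transparency. All data here are synthetic stand-ins.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LinearSegmentedColormap

# Two co-located variables on a 2D slice of the domain (illustrative shapes).
x, y = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
fire_temp = np.exp(-((x - 0.35) ** 2 + (y - 0.50) ** 2) / 0.02)  # hot core
smoke = np.exp(-((x - 0.55) ** 2 + (y - 0.60) ** 2) / 0.08)      # drifting plume

# Natural-color ramps: ember orange for fire, ash grey for smoke,
# each fully transparent where the variable goes to zero.
fire_cmap = LinearSegmentedColormap.from_list(
    "ember", [(0.0, (1.0, 1.0, 1.0, 0.0)), (1.0, (0.85, 0.33, 0.10, 0.9))])
smoke_cmap = LinearSegmentedColormap.from_list(
    "ash", [(0.0, (1.0, 1.0, 1.0, 0.0)), (1.0, (0.45, 0.45, 0.45, 0.6))])

fig, ax = plt.subplots(figsize=(6, 6))
ax.set_facecolor("#4a6741")  # muted forest green as the vegetation backdrop
ax.imshow(smoke, cmap=smoke_cmap, origin="lower", extent=(0, 1, 0, 1))
ax.imshow(fire_temp, cmap=fire_cmap, origin="lower", extent=(0, 1, 0, 1))
ax.set_title("Overlapping fields with natural-color, semi-transparent ramps")
plt.show()
```

In a full three-dimensional view, the same principle carries over to volume rendering with per-variable opacity transfer functions, so that smoke, fire temperature, and vegetation remain individually legible even while occupying the same space.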
Future explorations include an increased focus on the potential benefits of physical-digital integrations through augmented reality.