Wildfires

How can SAR detect vegetation changes throughout the wildfire cycle?

Verdugo Mountains, Los Angeles

In this Story Map we will show how Synthetic Aperture Radar (SAR) images can be used to detect loss of vegetation during a wildfire and recovery after a fire. The capabilities presented here are focused on L-band radars such as NISAR (NASA-ISRO SAR Mission).

Background

Wildfires play a key role in many ecosystems. Local communities have experienced the impacts of these disasters firsthand, from the loss of lives and property to disrupted ecosystem services. Most deaths and damage to infrastructure occur in the wildland-urban interface, and the policy response usually involves increased fire suppression activities in those areas. As a result, decades of fire suppression in the western U.S. have likely altered historical wildfire cycles. Knowledge of the local natural fire regime not only helps protect local populations, but also informs management of these areas, such as prescribed burning and fuel treatments (see the NISAR White Paper).

NASA Mobilizes to Aid California Wildfire Response

How can SAR map fire extent and severity?

By convention, SAR images are displayed in grayscale with bright areas indicating stronger detected signal, or backscatter. In general, radar brightness is highest over tall vegetation, while short vegetation, bare ground, and open water appear dark. Burned areas will appear darker post-fire with the loss of vegetation.

The example below displays HV images before and after the 2019 Rainbow Fire in Delta Junction, Alaska. The radar backscatter noticeably decreases within the fire scar in the post-fire image. Interestingly, some areas outside the fire perimeter also show a decrease in radar backscatter. Although the pre-fire image (August 2018) and the post-fire image (August 2020) were acquired at the same time of year, differences in ground moisture or other surface properties between the two years may also affect the backscatter.

SAR's HV (cross-polarized) images correlate with biomass and canopy volume as they are primarily sensitive to volume scattering associated with vegetation. Volume scattering decreases as vegetation is replaced with bare or sparsely vegetated ground after a fire, and this change in scattering can be used to map burned area extent and severity.

Radar backscatter in the HV polarization

2019 Rainbow Fire near Delta Junction, Alaska. HV grayscale pre-fire (left) and HV grayscale post-fire (right). The fire perimeter in red was retrieved from the National Interagency Fire Center (NIFC) and was derived from optical imagery.

The image on the left shows the area's mix of black spruce forest and open meadows. The image on the right shows burned black spruce forest post-fire, with a mix of standing or fallen trees, shorter woody shrubs, bare ground, and loss of vegetation canopy. Source:  https://akfireinfo.com/ 

Case Study

La Tuna Fire (2017)

The examples below demonstrate radar’s sensitivity to changes in vegetation cover over the entire wildfire cycle from the 2017 La Tuna Fire, which burned 29 square kilometers in the Verdugo Mountains and is the largest recorded fire in the city of Los Angeles. The data is from UAVSAR, NASA JPL's L-band airborne radar and NISAR testbed instrument.

Pre-fire Fuel Load & Post-fire Vegetation Loss

The HV backscatter images below were acquired before and after the September 2017 La Tuna Fire in the Verdugo Mountains, Los Angeles, California. The optical-derived burned area, retrieved from the National Interagency Fire Center (NIFC), is delineated in black. In the post-fire image, notice the decrease in HV backscatter within the fire perimeter, as well as some areas of disagreement with the optical-derived boundary. HV's sensitivity to changes in vegetation cover can be leveraged to improve existing fire extent maps.

The two images of the Verdugo Mountains further below illustrate how the post-fire vegetation differs between the burned (red dot) and not burned (green dot) regions.

Before the fire, higher HV values can be related to a higher fuel load, or the amount of vegetation available to burn. This information can aid decision-makers in determining fire risk and planning prescribed burns. A simple HV ratio, or difference between pre- and post-fire images, can also help improve upon published fire perimeters, which are usually generated manually by fire agencies.

HV image pre-fire (left, Oct 23, 2014) and post-fire (right, Nov 02, 2017), clipped to the extent of the Verdugo Mountains in Los Angeles, California.

Verdugo Mountains burned (left) and not burned (right). Image Credit: Naiara Pinto

SAR vs. Optical Burn Severity

Radar-derived estimates of fuel load and burn severity are not widely used by fire agencies as this is an emerging topic of research. Optical imagery is a more familiar technology to fire managers and has been more commonly used for decision making.

A simple ratio of pre- and post-fire HV images can show temporal change. Here, we calculate the HV Ratio (Pre-Fire HV / Post-Fire HV) and compare it with Landsat's dNBR (differenced Normalized Burn Ratio), a widely used index for fire-related change detection based on optical sensors. dNBR is provided for comparison with the HV ratio, as it captures burned area by exploiting the loss of healthy green vegetation, or changes in greenness. Notably, radar is not sensitive to greenness; instead, it detects shape and moisture properties, and it can collect data when smoke prevents optical sensors from seeing the surface.
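To make the calculation concrete, here is a minimal Python sketch of the HV ratio and dNBR described above. The file names, band choices, and the burn threshold are illustrative assumptions, not part of any UAVSAR or Landsat product specification.

```python
# Minimal sketch: pre/post-fire HV ratio and optical dNBR.
# File names and the 3 dB threshold are hypothetical examples.
import numpy as np
import rasterio

def read_band(path, band=1):
    """Read one raster band as a float array, with nodata set to NaN."""
    with rasterio.open(path) as src:
        data = src.read(band).astype("float64")
        if src.nodata is not None:
            data[data == src.nodata] = np.nan
    return data

# --- Radar: HV ratio (pre-fire HV / post-fire HV, linear power) ---
hv_pre = read_band("hv_prefire.tif")    # hypothetical pre-fire HV backscatter
hv_post = read_band("hv_postfire.tif")  # hypothetical post-fire HV backscatter
hv_ratio = hv_pre / hv_post             # > 1 where backscatter dropped after the fire

# The same change expressed in decibels (positive = loss of backscatter)
hv_change_db = 10 * np.log10(hv_pre) - 10 * np.log10(hv_post)

# --- Optical: differenced Normalized Burn Ratio from NIR/SWIR reflectance ---
def nbr(nir, swir):
    return (nir - swir) / (nir + swir)

dnbr = (nbr(read_band("pre_nir.tif"), read_band("pre_swir.tif"))
        - nbr(read_band("post_nir.tif"), read_band("post_swir.tif")))

# A crude burned-area mask from the radar change; the threshold would need
# to be calibrated for each site and sensor.
burned_mask = hv_change_db > 3.0
```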

Radar HV Ratio (left) and optical dNBR (right). The stripes in the optical dNBR are Landsat 7 data artifacts.

Temporal Fuel Load Monitoring

UAVSAR has collected data over the Verdugo Mountains since 2009. We can use these radar images to inspect changes over time in the HV backscatter inside and outside the fire perimeter. The time series plot below shows the mean and spread of HV values in decibels (low dB denotes low backscatter) from 2009 to 2020.

The 2017 image was collected shortly after the fire. Notice how the 2017 mean HV decreases inside the fire scar, due to the loss of vegetation cover and the increase in bare ground in the burned areas. The chart also shows lower HV means both inside and outside the fire perimeter in 2014 (pre-fire), likely due to the severe drought in California at that time, which affected vegetation cover in the mountains.
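As an illustration of how such a time series could be summarized, the sketch below computes the mean and spread of HV backscatter in dB inside and outside a fire-perimeter mask. The array and variable names are hypothetical; this is not the exact workflow used to produce the plot.

```python
# Minimal sketch: per-year HV statistics inside and outside a fire perimeter.
# Assumes co-registered HV images (linear power) and a boolean perimeter mask.
import numpy as np

def stats_db(hv_linear, mask):
    """Mean and standard deviation of HV backscatter (dB) over masked pixels."""
    valid = mask & np.isfinite(hv_linear) & (hv_linear > 0)
    values_db = 10 * np.log10(hv_linear[valid])
    return values_db.mean(), values_db.std()

def summarize(hv_by_year, inside_fire):
    """hv_by_year: dict year -> 2-D HV array; inside_fire: boolean mask."""
    rows = []
    for year, hv in sorted(hv_by_year.items()):
        mean_in, std_in = stats_db(hv, inside_fire)
        mean_out, std_out = stats_db(hv, ~inside_fire)
        rows.append({"year": year,
                     "inside_mean_db": mean_in, "inside_std_db": std_in,
                     "outside_mean_db": mean_out, "outside_std_db": std_out})
    return rows
```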

Mosaic of Fire Data

The maps below are mosaics of the radar observations across a decade in the mountains near Los Angeles. Radar signals bounce off burned, barren terrain differently than they reflect from unburned, brush-covered hillsides or from fresh growth. The colors indicate the relative amount of vegetation observed by different UAVSAR flights at different times.

The image below illustrates how these maps were assembled. Radar data were collected during UAVSAR flights in 2010, 2017, and 2020 over Angeles National Forest and other areas northeast of the greater Los Angeles metropolitan area.

Credit: NASA Earth Observatory

Overall, the colors are telling us that the Angeles National Forest contains a patchwork of plant communities at different stages of regeneration. For instance, areas with more red had more vegetation in 2010 than they do now. Areas with more blue and green shading had more vegetation (regrowth) in recent years. Yellow indicates areas burned in 2020 that had a higher volume of vegetation in 2010 and 2017 (red + green) but lower volume in 2020 (blue). Yellow perimeter lines on the inset below indicate the extent of several major fires: Station, Colby, San Gabriel (SG) Complex, La Tuna, and Bobcat.
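For readers who want to build a composite like this, the following sketch shows one way to stack three co-registered HV images (2010, 2017, 2020) into an RGB image. The dB stretch limits and variable names are assumptions for illustration, not the settings used for the published mosaic.

```python
# Minimal sketch: multi-temporal HV composite (2010 = red, 2017 = green, 2020 = blue).
import numpy as np

def stretch(db_image, low=-30.0, high=-5.0):
    """Linearly rescale a dB image to the 0-1 display range (limits are assumed)."""
    return np.clip((db_image - low) / (high - low), 0.0, 1.0)

def hv_composite(hv_2010_db, hv_2017_db, hv_2020_db):
    """Stack three co-registered HV images (dB) into an RGB array for display."""
    return np.dstack([stretch(hv_2010_db),
                      stretch(hv_2017_db),
                      stretch(hv_2020_db)])
    # e.g. matplotlib.pyplot.imshow(hv_composite(...)) to view the mosaic
```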

The panels below illustrate how to interpret the multi-temporal HV composite for three fires.

Station Fire (2009)

2010 = Red, 2017 = Green, 2020 = Blue

The 2009 Station Fire appears as a mix of green and blue in this HV multi-temporal mosaic. There was high HV backscatter in 2017 and 2020 as the vegetation regenerated in the years after the fire.

La Tuna Fire (2017)

2010 = Red, 2017 = Green, 2020 = Blue

The 2017 La Tuna Fire appears red in this HV multi-temporal mosaic. There was high HV backscatter in 2010 (pre-fire) and low HV backscatter in 2017 and 2020 (post-fire).

Bobcat Fire (2020)

2010 = Red, 2017 = Green, 2020 = Blue

The 2020 Bobcat Fire appears yellow in this HV multi-temporal mosaic. There was high HV backscatter in 2010 and 2017 (pre-fire) and low HV backscatter in 2020 (post-fire).

Technology

Synthetic Aperture Radar (SAR): For Forest Structure

Why SAR?

Credit: NASA SAR Handbook

Synthetic Aperture Radar (SAR) systems transmit microwave pulses and receive the backscattered echoes from the Earth's surface. This backscatter information can be transformed into high-resolution radar remote-sensing products with detailed information about surface characteristics.

Radar can penetrate cloud cover and operate day or night, which can prove particularly useful during wildfire events. During a fire, there are often smoky conditions and persistent cloud cover. The fast-moving nature of fires also means nighttime acquisitions can provide more up-to-date fire extent information. 

Radar is highly sensitive to changes in surface roughness, such as changes in vegetation structure. Long radar wavelengths can penetrate the canopy and interact with its components (leaves, stems, branches, trunks) before returning to the sensor. The penetration depth varies with the instrument's wavelength, as shown below.

Credit: NASA SAR Handbook

NISAR (the NASA-ISRO SAR Mission) is an upcoming radar instrument that will collect data at the L-band wavelength (23.8 cm), the same band as the UAVSAR images shown in this StoryMap. The longer L-band wavelength can penetrate further into vegetation than the C-band used by other common SAR sensors. NISAR will provide high-resolution, near-global, and continuous mapping of Earth's natural resources and hazards, such as wildfires. (Note that the airborne UAVSAR data are of higher spatial resolution than that planned for NISAR.) Explore the NISAR Sample Data Product Suite, with the planned NISAR data products and their specifications, here: https://nisar.jpl.nasa.gov/data/sample-data/

Radar Scattering and Polarizations

Credit: NASA SAR Handbook

Radar data are collected at different polarizations, or combinations of horizontally and vertically oriented radar waves (e.g., HH, HV). For example, HV denotes a horizontally transmitted, vertically received wave. Each polarization is a separate data product generated by the radar, and the different polarizations interact with the ground cover in distinct ways. For wildfire applications, the HV channel from NISAR will be the most relevant because volume scattering from vegetation produces stronger backscatter returns in this channel.

The expected scattering mechanisms in a wildfire area are described below:

  • Volume Scattering: most common in vegetated areas, where the signal interacts with all components of the canopy (leaves, branches, trunks, etc.) before returning to the sensor.
  • Surface Scattering: occurs over smooth surfaces, such as bare ground, where the signal interacts directly with the soil surface. Most of the signal scatters away from the radar, so these areas appear dark in the image.
  • Double Bounce (Volume-Surface) Scattering: occurs when the signal bounces between the ground surface and vertical vegetation, such as soil-to-trunk scattering. Most of the signal is returned to the sensor, so these areas appear very bright in the image.
  • Rough Surface Scattering: occurs where some, but not all, of the signal scatters away from the radar, for example over areas of shorter vegetation.

Credit: NASA SAR Handbook

For mountainous areas such as the Verdugo Mountains shown in this study, distortions from radar terrain shadow should be considered. This is beyond the scope of this StoryMap, but radiometric terrain correction (RTC) can be applied to the UAVSAR products to remove unreliable pixels caused by this effect (Simard et al., 2016). The general approach is to estimate the local ground area illuminated by the radar beam and normalize the radar backscatter values by that area. Since NISAR's range of incidence angles will be narrower than that of airborne instruments, slope-related distortions should have less impact on its data.
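As a rough illustration of the area-normalization idea (not the full algorithm of Simard et al., 2016), the sketch below divides backscatter by the ratio of the locally illuminated ground area to a flat-surface reference area and masks unreliable pixels. The input rasters and the threshold are assumptions for illustration.

```python
# Minimal sketch of radiometric terrain correction by illuminated-area normalization.
# Inputs (backscatter, area rasters) are assumed to be co-registered float arrays,
# e.g. with the local illuminated area derived from a DEM and the imaging geometry.
import numpy as np

def rtc_normalize(backscatter, local_illuminated_area, flat_reference_area,
                  min_area_ratio=0.1):
    """Normalize backscatter by the local-to-reference illuminated-area ratio."""
    area_ratio = local_illuminated_area / flat_reference_area
    corrected = backscatter / area_ratio
    # Mask pixels where the illuminated area is unreliable (e.g. layover or shadow);
    # the 0.1 cutoff is only an illustrative value.
    corrected[area_ratio < min_area_ratio] = np.nan
    return corrected
```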

References and Resources

Donnellan, A., Parker, J., Milliner, C., Farr, T. G., Glasscoe, M., Lou, Y., Zheng, Y., & Hawkins, B. (2018). UAVSAR and optical analysis of the Thomas Fire scar and Montecito debris flows: Case study of methods for disaster response using remote sensing products. Earth and Space Science, 5(7), 339–347. https://doi.org/10.1029/2018ea000398

Ban, Y., Zhang, P., Nascetti, A., Bevington, A. R., & Wulder, M. A. (2020). Near real-time wildfire progression monitoring with Sentinel-1 SAR time series and deep learning. Scientific Reports, 10(1). https://doi.org/10.1038/s41598-019-56967-x

Carlowicz, M., Stevens, J., Peacock, A., Pinto, N., & Lou, Y. (2021, February 6). A Mosaic of Fire Data. NASA Earth Observatory. Retrieved March 20, 2023, from https://earthobservatory.nasa.gov/images/147872/a-mosaic-of-fire-data 

Czuchlewski, K. R., & Weissel, J. K. (2005). Synthetic Aperture Radar (SAR)-based mapping of wildfire burn severity and recovery. Proceedings, 2005 IEEE International Geoscience and Remote Sensing Symposium (IGARSS '05). https://doi.org/10.1109/igarss.2005.1526102

Flores-Anderson, A. I., Herndon, K. E., Thapa, R. B., & Cherrington, E. (2019). The SAR Handbook: Comprehensive methodologies for forest monitoring and biomass estimation. No. MSFC-E-DAA-TN67454. https://doi.org/10.25966/nr2c-s697

Jung, J., Yun, S.-H., Kim, D.-J., & Lavalle, M. (2018). Damage-mapping algorithm based on coherence model using multitemporal polarimetric–interferometric SAR data. IEEE Transactions on Geoscience and Remote Sensing, 56(3), 1520–1532. https://doi.org/10.1109/tgrs.2017.2764748

La Tuna fire, city’s biggest by acreage, now 80% contained, officials say. (2017, September 6). Los Angeles Times. https://www.latimes.com/local/lanow/la-me-ln-verdugo-fire-containment-20170905-story.html

NISAR: The NASA-ISRO SAR Mission. (2017). Smart Firefighting: Arming land managers with new information [White paper]. NASA. https://nisar.jpl.nasa.gov/system/documents/files/13_NISAR_Applications_Fire_Management1.pdf

Parker, J., Donnellan, A., & Glasscoe, M. (2021). Survey of transverse range fire scars in 10 years of UAVSAR polarimetry. Earth and Space Science, 8(5). https://doi.org/10.1029/2021ea001644

Polarimetry | Get to Know SAR. (n.d.). NASA-ISRO SAR Mission (NISAR). https://nisar.jpl.nasa.gov/mission/get-to-know-sar/polarimetry/

Rains C. (2019) Detection of Fire Burn Scars by UAVSAR: Immediate, Short-term, and Multi-year Observations and Applications. NASA presentation.

Simard, M., et al. (2016). Radiometric correction of airborne radar images over forested terrain with topography. IEEE Transactions on Geoscience and Remote Sensing, 54(8), 4488–4500.

Acknowledgments

UAVSAR data courtesy NASA/JPL-Caltech
