TY - JOUR
T1 - DLSR-FireCNet
T2 - A deep learning framework for burned area mapping based on decision level super-resolution
AU - Seydi, Seyd Teymoor
AU - Sadegh, Mojtaba
N1 - Publisher Copyright:
© 2025 The Authors
PY - 2025/1
Y1 - 2025/1
N2 - Widespread availability of Earth observing satellites offers the much-needed information to monitor global wildfire activity. Here, we propose a novel decision level super-resolution deep learning burned area mapping (BAM) model based on the MODIS surface reflectance product, which resolves the limitations associated with both coarse and medium-to-high resolution satellites. Medium-to-high resolution satellite imagery has poor temporal resolution, which is further limited by cloud and aerosol blockage, posing a challenge for timely and accurate BAM. Coarse resolution sensors offer more frequent imagery, but their spatial resolution limits their application for BAM. Our model, dubbed DLSR-FireCNet, uses two spectral bands (Red and Near-Infrared; 250 m resolution) from bi-temporal pre- and post-fire imagery for deep feature extraction, producing BAM at a target 30 m resolution. DLSR-FireCNet has a cascading structure to preserve BA edges while alleviating missed detections and false alarms. Trained on 834 large wildfires from 2000 to 2007, the model's performance was rigorously evaluated on 91 out-of-sample large wildfires across the U.S. from 2008 to 2020. With an average Overall Accuracy of 0.98 and a Matthews correlation coefficient of 0.89, DLSR-FireCNet not only outperformed state-of-the-art U-NET++, U-NET+++, Swin-Unet, and HR-Net models but also showed robust performance across various test areas. Additionally, DLSR-FireCNet markedly outperforms the available global MCD64A1 and FireCCI burned area products on the test cases. The proposed model structure offers opportunities to develop accurate, medium-to-high resolution global burned area products for improved monitoring and mitigation of wildfires.
KW - Cascade
KW - Decision level super-resolution
KW - MODIS
KW - Machine learning
KW - Siamese
KW - U-Net
KW - Wildfire
UR - http://www.scopus.com/inward/record.url?scp=85219655602&partnerID=8YFLogxK
U2 - 10.1016/j.rsase.2025.101513
DO - 10.1016/j.rsase.2025.101513
M3 - Article
AN - SCOPUS:85219655602
VL - 37
JO - Remote Sensing Applications: Society and Environment
JF - Remote Sensing Applications: Society and Environment
M1 - 101513
ER -