Abstract

Synthetic Aperture Radar (SAR) imagery plays an important role in observing tropical cyclones (TCs). However, C-band attenuation caused by rain bands and signal saturation at high wind speeds make it difficult to retrieve the fine structure of TCs effectively. In this paper, a dual-level contextual attention generative adversarial network (DeCA-GAN) is tailored for reconstructing SAR wind speeds in TCs. The DeCA-GAN follows an encoder–neck–decoder architecture, which performs well for high wind speeds and for reconstructing large regions of low-quality data. A dual-level encoder comprising a convolutional neural network and a self-attention mechanism is designed to extract the local and global features of the TC structure. After feature fusion, the neck exploits contextual features to form an outline of the reconstruction, and the decoder up-samples the features to produce the reconstructed result. The proposed deep learning model has been trained and validated using the European Centre for Medium-Range Weather Forecasts (ECMWF) atmospheric model product and can be used directly to improve the quality of SAR wind speed data. Wind speeds are reconstructed well in regions of low-quality SAR data: on the test set, the root mean square error between the model output and ECMWF in these regions is half that of the existing SAR wind speed product. These results indicate that deep learning methods are effective for reconstructing SAR wind speeds.
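To make the encoder–neck–decoder idea concrete, the following is a minimal PyTorch-style sketch of a generator with a dual-level encoder (a convolutional branch for local features and a self-attention branch for global context), a contextual neck, and an up-sampling decoder. The module names, layer sizes, and attention configuration are illustrative assumptions and do not reproduce the paper's exact DeCA-GAN; the discriminator and training loop are omitted.

```python
# Sketch of an encoder-neck-decoder generator with a dual-level encoder:
# a CNN branch for local features and a self-attention branch for global
# context. All layer sizes are illustrative assumptions, not the paper's
# DeCA-GAN configuration.
import torch
import torch.nn as nn


class DualLevelEncoder(nn.Module):
    def __init__(self, in_ch=1, ch=64, heads=4):
        super().__init__()
        # Local branch: strided convolutions extract fine-scale TC structure.
        self.local = nn.Sequential(
            nn.Conv2d(in_ch, ch, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch * 2, 4, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Global branch: multi-head self-attention over the down-sampled feature map.
        self.attn = nn.MultiheadAttention(embed_dim=ch * 2, num_heads=heads, batch_first=True)
        self.fuse = nn.Conv2d(ch * 4, ch * 2, 1)  # fuse local and global features

    def forward(self, x):
        local = self.local(x)                       # (B, 2C, H/4, W/4)
        b, c, h, w = local.shape
        tokens = local.flatten(2).transpose(1, 2)   # (B, HW, 2C)
        global_feat, _ = self.attn(tokens, tokens, tokens)
        global_feat = global_feat.transpose(1, 2).reshape(b, c, h, w)
        return self.fuse(torch.cat([local, global_feat], dim=1))


class Neck(nn.Module):
    """Dilated convolutions that aggregate contextual features into a
    coarse outline of the reconstructed wind field."""
    def __init__(self, ch=128):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=2, dilation=2), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=4, dilation=4), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class Decoder(nn.Module):
    def __init__(self, ch=128, out_ch=1):
        super().__init__()
        # Transposed convolutions up-sample back to the input resolution.
        self.up = nn.Sequential(
            nn.ConvTranspose2d(ch, ch // 2, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(ch // 2, out_ch, 4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.up(x)


class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder, self.neck, self.decoder = DualLevelEncoder(), Neck(), Decoder()

    def forward(self, degraded_wind):
        return self.decoder(self.neck(self.encoder(degraded_wind)))


# Usage: reconstruct a 128x128 single-channel SAR wind-speed patch.
wind = torch.randn(1, 1, 128, 128)
print(Generator()(wind).shape)  # torch.Size([1, 1, 128, 128])
```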
