Abstract

Weather radar data can capture large-scale bird migration information, helping solve a range of problems in migration ecology. However, extracting and identifying bird information from weather radar data remains one of the challenges of radar aeroecology. In recent years, deep learning has been applied to radar data processing and has proved to be an effective strategy. This paper describes a deep learning method for extracting biological target echoes from weather radar images. The model uses a two-stream CNN (Atrous-Gated CNN) architecture that generates fine-scale predictions by combining key modules such as squeeze-and-excitation (SE) and atrous spatial pyramid pooling (ASPP). The SE block enhances attention on the feature map, while the ASPP block expands the receptive field, helping the network understand global shape information. Experiments show that on typical historical data from the China next generation weather radar (CINRAD), the precision of the network in identifying biological targets reaches 99.6%. Our network can cope with complex weather conditions, enabling long-term, automated monitoring of weather radar data to extract biological target information and providing feasible technical support for bird migration research.
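The SE module mentioned above is a standard building block: it "squeezes" each channel to a single statistic by global average pooling, then "excites" the channels through two small fully connected layers whose sigmoid output reweights the feature map. A minimal NumPy sketch of that idea (illustrative only, not the paper's implementation; `w1`, `b1`, `w2`, `b2` are hypothetical excitation weights):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(feature_map, w1, b1, w2, b2):
    """Squeeze-and-excitation on a (channels, height, width) feature map.

    Squeeze: global average pooling over the spatial dims -> (C,)
    Excite:  two fully connected layers, ReLU then sigmoid -> (C,) gates
    Scale:   reweight each input channel by its gate in (0, 1)
    """
    squeezed = feature_map.mean(axis=(1, 2))        # (C,) channel statistics
    hidden = np.maximum(0.0, w1 @ squeezed + b1)    # (C // r,) bottleneck, ReLU
    gates = sigmoid(w2 @ hidden + b2)               # (C,) per-channel weights
    return feature_map * gates[:, None, None]       # broadcast over H and W
```

The bottleneck dimension (C divided by a reduction ratio r) keeps the module cheap while still letting the network model channel interdependencies.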

Highlights

  • Our model improves IoU on precipitation echo and biological echo by 1.6% and 1.7%, respectively, and achieves higher F-score and IoU than previous work in the segmentation of biological targets

  • Our results show that deep learning is an effective tool for biological echo extraction from China next generation weather radar (CINRAD) data and is likely applicable to other radar data processing tasks

  • Extracting and identifying biological information has been a long-standing challenge in radar aeroecology that substantially limits the study of bird migration patterns

Introduction

We pay particular attention to the performance of the network on biological echoes. The recall of MistNet is close to, or even higher than, that of our network; we assume this is because MistNet tends to classify pixels as biological echo so as to retain more biological information, while only echoes with obvious precipitation characteristics are classified as precipitation. This classification decision rarely loses biological information, but it causes a large number of pixels to be incorrectly classified as biological, yielding a lower precision of 93.5%. The precision of Atrous-Gated CNN is 6.1% higher while still retaining biological echoes to the maximum extent, leading to a better F-score and IoU.
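The precision, recall, F-score, and IoU figures discussed above follow the standard per-class definitions for segmentation. A minimal sketch of how such pixel-level metrics are computed from true-positive, false-positive, and false-negative pixel counts (the function name is illustrative, not from the paper):

```python
def seg_metrics(tp, fp, fn):
    """Per-class segmentation metrics from pixel counts.

    tp: pixels correctly labeled as the class
    fp: pixels of other classes wrongly labeled as the class
    fn: pixels of the class the model missed
    """
    precision = tp / (tp + fp)                            # purity of predictions
    recall = tp / (tp + fn)                               # coverage of the class
    f_score = 2 * precision * recall / (precision + recall)
    iou = tp / (tp + fp + fn)                             # intersection over union
    return precision, recall, f_score, iou
```

This makes the trade-off described above concrete: a model that over-predicts the biological class raises recall (fewer fn) at the cost of precision (more fp), and IoU penalizes both error types at once.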
