Abstract

Observing clouds to understand the weather is one of the main ways people forecast it. Deep learning has made progress in weather forecasting, especially in the automatic recognition of disastrous weather from satellite images, which can be framed as an image classification problem. Publicly available satellite image benchmark databases attempt to link weather directly to satellite images. However, the image modality alone is far from sufficient to correctly identify weather systems and clouds. We therefore augment the images with meteorological elements, labeling five of them: season, month, date stamp, and geographic longitude and latitude. To use these modalities effectively for identifying clouds and weather systems through satellite image classification, we propose a new classification framework: the Multi-modal Auxiliary Network (MANET). MANET consists of three parts: an image feature extraction module based on a convolutional neural network, a meteorological feature extraction module based on a perceptron, and layer-level multi-modal fusion. MANET successfully integrates multi-modal information, including meteorological elements and satellite images. Experimental results show that MANET achieves better classification of weather systems, clouds, and land cover from satellite images.
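The abstract describes a two-branch architecture: CNN features from the image, a perceptron embedding of the five meteorological elements, and layer-level fusion of the two. The sketch below illustrates that fusion pattern only; the dimensions, weights, and four-class output are illustrative assumptions, not the paper's actual configuration, and the CNN backbone is replaced by a stand-in feature vector.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Stand-in for the CNN branch: in MANET these features come from a
# convolutional backbone; here we just use a random 128-d vector.
image_feat = rng.standard_normal(128)

# Five meteorological elements (season, month, date stamp, longitude,
# latitude), normalized to numeric values. The values are illustrative.
met_elements = np.array([0.25, 0.50, 0.37, 0.61, 0.42])

# Perceptron branch: one hidden layer lifts the 5 elements to a 32-d embedding.
W1 = rng.standard_normal((32, 5)) * 0.1
b1 = np.zeros(32)
met_feat = relu(W1 @ met_elements + b1)

# Layer-level fusion: concatenate the two modality embeddings at an
# intermediate layer, then classify the joint feature.
fused = np.concatenate([image_feat, met_feat])   # 160-d joint feature
W2 = rng.standard_normal((4, 160)) * 0.1         # 4 hypothetical classes
logits = W2 @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()                             # softmax over classes

print(fused.shape, probs.shape)
```

In a trained network the weights would of course be learned end to end; the point here is only that the meteorological embedding enters as an auxiliary input concatenated with the image features before classification.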
