Abstract

Retrieving informative images from large-scale aurora data is of great significance in the field of space physics. In this article, we propose a hierarchical deep embedding (HDE) model to assist scientists with aurora image retrieval. Unlike conventional bag-of-words (BoW) models, which employ local cues individually, HDE performs visual matching hierarchically: only keypoints that are simultaneously similar at the local, regional, and global levels are treated as true matches. The added contextual evidence effectively reduces false matches and improves the precision of visual matching. Specifically, to complement the local SIFT feature, a convolutional neural network (CNN) is refined with a polar region pooling (PRP) layer to extract features from regional patches and the global image, forming a group of hierarchical deep features with strong discriminative power. Moreover, an improved polar meshing (IPM) scheme is presented to determine keypoint positions; it is better suited to images captured by a circular fisheye lens and is capable of reflecting the physical information in aurora images. Extensive experiments conducted on a large-scale aurora dataset indicate that the proposed HDE model greatly improves retrieval accuracy with acceptable memory cost and efficiency. In addition, the effectiveness of the IPM scheme and the superiority of the hierarchical deep feature integration are demonstrated separately.
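
The abstract describes the core matching rule only in words; the sketch below illustrates one plausible reading of it, in which a keypoint pair is accepted only when its local (SIFT), regional, and global deep features all agree. The field names, cosine-similarity criterion, and threshold values are illustrative assumptions, not the authors' exact formulation.

    # Hedged sketch: hierarchical verification of a candidate keypoint match.
    # Thresholds and feature names are assumptions for illustration only.
    import numpy as np

    def cosine(a, b):
        # Cosine similarity between two feature vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def is_true_match(kp_query, kp_db, t_local=0.8, t_regional=0.7, t_global=0.6):
        # Accept the pair only if it is similar at all three levels simultaneously,
        # mirroring the local / regional / global evidence described in the abstract.
        return (cosine(kp_query["sift"], kp_db["sift"]) >= t_local and
                cosine(kp_query["regional_feat"], kp_db["regional_feat"]) >= t_regional and
                cosine(kp_query["global_feat"], kp_db["global_feat"]) >= t_global)

In such a scheme, the regional and global checks act as contextual filters on top of the local SIFT comparison, which is how the abstract argues false matches are suppressed.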
