Abstract

We investigate the problem of efficiently localizing sketch-depicted scenes in a remote sensing image dataset. We pose the problem as remote sensing image retrieval with sketch queries and explore hashing techniques to achieve efficient retrieval. Given two training datasets of sketches and remote sensing images that share a common set of class labels, we develop a hashing strategy that co-constructs two hash code books, one for the sketches and one for the remote sensing images. The co-construction strategy encourages hash codes of sketches and remote sensing images from different classes to be far apart and those from the same class to be close. This property is maintained by two cohesion-intensive cues: 1) an interclass pairwise disperse cue (InterPDC) and 2) an intraclass pairwise balance cue (IntraPBC). Using the two co-constructed hash code books, we train two linear mapping models that generate hash codes for sketches and remote sensing images separately. Sorting the Hamming distances between the sketch hash codes and the remote sensing image hash codes yields efficient remote sensing image retrieval with sketch queries, which in turn localizes the sketch-depicted scenes in the remote sensing image dataset. In addition, our method can also be used to quickly localize sketch-depicted scenes in a single large remote sensing image. Extensive experiments on public datasets validate the effectiveness and efficiency of our method.
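As a minimal sketch of the retrieval step described above (not the paper's implementation; the code sizes, variable names, and random codes are illustrative), ranking a database of binary image hash codes against a sketch hash code by Hamming distance can be done as follows:

```python
import numpy as np

# Illustrative setup: random binary hash codes standing in for the codes
# produced by the two learned linear mapping models.
rng = np.random.default_rng(0)
n_images, n_bits = 8, 16
image_codes = rng.integers(0, 2, size=(n_images, n_bits), dtype=np.uint8)
sketch_code = rng.integers(0, 2, size=n_bits, dtype=np.uint8)

# Hamming distance = number of differing bits between two binary codes.
distances = np.count_nonzero(image_codes != sketch_code, axis=1)

# Ranked retrieval: database image indices sorted by distance, nearest first.
ranking = np.argsort(distances, kind="stable")
print(ranking.tolist())
```

Because the codes are short binary vectors, this comparison is a bitwise operation per database entry, which is what makes the retrieval (and hence the scene localization) efficient at scale.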
