Abstract
In ground-based remote sensing cloud image observation, images are captured at the highest possible resolution to obtain sufficient information about clouds. However, when features are extracted and classification is performed on the original images, a high resolution often implies a high, or even unacceptable, computational cost. In practice, a simple and commonly adopted approach is to resize the original image to a lower resolution. An unavoidable question is whether useful information is lost in this resizing operation. This paper demonstrates, through the analysis of local binary pattern (LBP) histogram features, that information loss is inevitable and may lead to poor classification results. However, this problem has always been neglected in previous studies, where the original image is resized arbitrarily without any criterion. In particular, LBP-based histogram features essentially describe the distribution of local features. Thus, a criterion combining the Kullback–Leibler divergence between the LBP histograms of the original and resized images with a penalty term imposed on the resolution is proposed to select the resolution of the resized image. The optimal resolution of the resized image is selected by minimizing this criterion. Furthermore, experiments on three ground-based remote sensing cloud image data sets with different original resolutions validate this criterion through the analysis of LBP histogram features.
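The following Python snippet is a minimal sketch of the criterion described above: the LBP histograms of the original and resized images are compared with the Kullback–Leibler divergence, and a penalty term that grows with the resolution is added. The uniform-LBP settings, the candidate resolution list, and the linear penalty on the pixel count (with weight `lam`) are illustrative assumptions and do not reproduce the paper's exact formulation.

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) gives the KL divergence KL(p || q)
from skimage.feature import local_binary_pattern
from skimage.transform import resize


def lbp_histogram(image, P=8, R=1):
    """Normalized histogram of uniform LBP codes for a grayscale image."""
    codes = local_binary_pattern(image, P, R, method="uniform")
    n_bins = P + 2  # uniform LBP with P neighbors yields P + 2 distinct codes
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist + 1e-12  # avoid zero bins when computing the KL divergence


def select_resolution(original, candidate_sizes, lam=1e-6):
    """Pick the resized resolution minimizing
    KL(original LBP hist || resized LBP hist) + lam * (number of pixels).

    The penalty form and `lam` are assumptions for illustration only.
    """
    h_orig = lbp_histogram(original)
    best_size, best_score = None, np.inf
    for size in candidate_sizes:
        resized = resize(original, size, anti_aliasing=True)
        h_resized = lbp_histogram(resized)
        score = entropy(h_orig, h_resized) + lam * size[0] * size[1]
        if score < best_score:
            best_size, best_score = size, score
    return best_size, best_score
```

For example, calling `select_resolution(image, [(1024, 1024), (512, 512), (256, 256)])` would return the candidate resolution whose LBP histogram stays closest to that of the original image while keeping the resolution penalty small.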