Abstract

Local structural information is crucially important for the human visual system when perceiving natural scenes. In recent years, a variety of local image structure descriptors have been proposed to artificially model this aspect of visual perception. Although existing descriptors have shown strong performance, one general limitation is their numerical instability, which stems from ignoring the spatial correlation of local orientation. In this paper, we propose a local image structure descriptor that models the anisotropic mechanism of the primary visual cortex. In particular, the pixel-wise anisotropy values of a given image are calculated via the pseudo-Wigner-Ville distribution (PWVD) and Rényi entropy. The excitatory/inhibitory interactions among visual neurons in the local receptive field are then modeled by measuring the similarities between their anisotropies. By mapping visual neurons to image pixels, the correlation between a central pixel and its local neighbors can be represented as a binary pattern, termed the local anisotropic pattern (LAP). Experimental results on texture classification verify that the proposed LAP achieves satisfactory classification accuracy, rotation invariance, and noise robustness. Experimental results on no-reference image quality assessment demonstrate that LAP achieves state-of-the-art performance in objectively evaluating the perceptual quality of natural images, reflecting that LAP can accurately represent the degradation of local image structure.
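The binary-coding step described above resembles an LBP-style encoding, except that bits are set by comparing anisotropy values rather than raw intensities. The sketch below illustrates that idea under stated assumptions: `aniso` is a precomputed pixel-wise anisotropy map (the paper obtains it via PWVD and Rényi entropy, which is not reproduced here), and the similarity measure is simplified to an absolute-difference threshold `tau`; both the function name and the thresholding rule are hypothetical, not the paper's exact formulation.

```python
import numpy as np

def lap_code(aniso, tau=0.1):
    """Hypothetical sketch of an LAP-like binary pattern.

    `aniso` is a 2-D pixel-wise anisotropy map (assumed precomputed,
    e.g. via PWVD + Renyi entropy as in the paper). For each interior
    pixel, a neighbor contributes a 1-bit when its anisotropy lies
    within `tau` of the center's (a stand-in for the excitatory
    interaction), yielding an 8-bit code per pixel.
    """
    H, W = aniso.shape
    # 8-neighborhood offsets, clockwise from the top-left neighbor
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    center = aniso[1:-1, 1:-1]
    codes = np.zeros((H - 2, W - 2), dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = aniso[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx]
        codes |= (np.abs(neigh - center) < tau).astype(np.uint8) << bit
    return codes
```

On a perfectly homogeneous anisotropy map every neighbor is similar to the center, so every pixel receives the all-ones code 255; a histogram of such codes over the image would then serve as the texture feature, as is standard for binary-pattern descriptors.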
