Abstract
In this paper, we present an object recognition approach based on co-occurrence similarities of Gabor magnitude textures. A novel image descriptor, the multichannel Gabor magnitude co-occurrence matrix (MGMCM), is designed to characterize Gabor textures for object representation and similarity matching. The descriptor generalizes multichannel color co-occurrence matrices (MCMs), focusing instead on the robust and discriminative magnitude textures of Gabor-filtered images. Our approach starts with a Gabor wavelet transformation of each object image. An exploratory learning algorithm is proposed to learn channel-adaptive magnitude truncation and quantization-level parameters; the resulting magnitude quantization reduces the overall bias and peakedness of the feature distributions in each channel and thereby avoids over-sparse co-occurrence distributions on average. Direction-based grouping is adopted to reduce the computational complexity of MGMCM extraction under a specific neighborhood mode on the grouped, rescaled magnitude images of each object image. Treating each MGMCM as a probability distribution lying on a multinomial manifold, we represent each object image as a point on a product multinomial manifold. Using multinomial geometry and a metric extension technique, we construct the p-order Minkowski co-occurrence information distance for similarity matching between albums of Gabor magnitude textures. The feasibility and effectiveness of the approach are validated by experimental results on the Yale and FERET face databases, the PolyU palmprint database, the COIL-20 object database, and the Zurich buildings database.
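The sketch below is a minimal, illustrative rendering of the pipeline summarized above, not the authors' implementation: it assumes hand-built complex Gabor kernels, fixed (rather than learned) truncation and level parameters, a single co-occurrence offset in place of the paper's direction-based grouping and neighborhood modes, the standard Fisher-Rao geodesic distance on the multinomial manifold, and a p-order Minkowski combination across channels. All parameter names and values are hypothetical.

```python
# Hedged sketch: Gabor magnitude co-occurrence matching between two images.
import numpy as np
from scipy.signal import convolve2d


def gabor_kernel(sigma=4.0, theta=0.0, lam=8.0, gamma=0.5, size=21):
    """Complex Gabor kernel for one scale/orientation channel."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    carrier = np.exp(1j * 2 * np.pi * xr / lam)
    return envelope * carrier


def gabor_magnitude(image, kernel):
    """Magnitude of the Gabor-filtered image for one channel."""
    return np.abs(convolve2d(image, kernel, mode="same"))


def quantize(mag, truncation, levels):
    """Truncate and rescale magnitudes into discrete levels
    (fixed values stand in for the learned channel-adaptive parameters)."""
    clipped = np.minimum(mag, truncation)
    return np.floor(clipped / truncation * (levels - 1)).astype(int)


def cooccurrence(q, levels, offset=(0, 1)):
    """Normalized co-occurrence matrix of quantized levels at one offset."""
    dy, dx = offset
    a = q[max(0, -dy):q.shape[0] - max(0, dy), max(0, -dx):q.shape[1] - max(0, dx)]
    b = q[max(0, dy):q.shape[0] + min(0, dy), max(0, dx):q.shape[1] + min(0, dx)]
    C = np.zeros((levels, levels))
    np.add.at(C, (a.ravel(), b.ravel()), 1.0)
    return C / C.sum()


def multinomial_geodesic(P, Q):
    """Fisher-Rao geodesic distance between two co-occurrence distributions."""
    bc = np.sum(np.sqrt(P * Q))
    return 2.0 * np.arccos(np.clip(bc, 0.0, 1.0))


def minkowski_cooccurrence_distance(Ps, Qs, p=2):
    """p-order Minkowski combination of per-channel geodesic distances."""
    d = np.array([multinomial_geodesic(P, Q) for P, Q in zip(Ps, Qs)])
    return np.sum(d**p) ** (1.0 / p)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img1, img2 = rng.random((64, 64)), rng.random((64, 64))
    thetas = [k * np.pi / 4 for k in range(4)]  # 4 orientation channels
    Ps, Qs = [], []
    for th in thetas:
        k = gabor_kernel(theta=th)
        for img, out in ((img1, Ps), (img2, Qs)):
            q = quantize(gabor_magnitude(img, k), truncation=5.0, levels=8)
            out.append(cooccurrence(q, levels=8))
    print(minkowski_cooccurrence_distance(Ps, Qs, p=2))
```

The per-channel distance treats each normalized co-occurrence matrix as a point on a multinomial manifold via the square-root (sphere) embedding; summing the channel distances with a p-order Minkowski norm mirrors, under these assumptions, the product-manifold matching described in the abstract.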