Abstract
Motivated by the recent success of deep networks in providing effective and abstract image representations, this paper proposes a multi-layer architecture called multi-layer local energy patterns (ML-LEP) for texture representation and classification. The proposed approach follows the multi-layer convolutional neural network paradigm and is built upon the single-layer local energy pattern (LEP) approach, a statistical histogram-based method for texture representation. A distinguishing aspect of the proposed multi-layer method, compared to other deep convolutional architectures, is that it bypasses the computationally expensive learning stage by using fixed filters. As such, the proposed training-free network circumvents the need for large amounts of data to learn system parameters. An extensive investigation is carried out to determine the merits of different nonlinear operators in the proposed architecture: an energy-based nonlinearity, the absolute value operator, and rectifier functions are compared against one another. Experiments conducted on three challenging databases, KTH-TIPS, KTH-TIPS2-a, and UIUC, indicate that extending LEP to the multi-layer ML-LEP is effective and leads to better performance. Moreover, the proposed ML-LEP approach is compared to several other well-known descriptors in the field, achieving the best reported performance on the KTH-TIPS and KTH-TIPS2-a databases despite being training-free.
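To make the general pipeline concrete, the following is a minimal Python sketch of a training-free, fixed-filter multi-layer descriptor with a selectable nonlinearity. It is not the authors' implementation: the oriented filter bank, the layer count, and the plain histogram encoding are illustrative assumptions, and only the three families of nonlinearities named in the abstract (energy, absolute value, rectifier) are taken from the source.

```python
# Sketch of a fixed-filter multi-layer texture descriptor in the spirit of
# ML-LEP. Assumptions (not from the paper): oriented zero-mean kernels as
# the fixed filter bank, two layers, and a per-map intensity histogram in
# place of the actual LEP pattern-histogram encoding.
import numpy as np
from scipy.signal import convolve2d

def make_fixed_filters(size=7, n_orientations=4):
    """Build a small bank of fixed (training-free) oriented filters."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    filters = []
    for k in range(n_orientations):
        theta = k * np.pi / n_orientations
        u = xx * np.cos(theta) + yy * np.sin(theta)
        g = np.exp(-(xx**2 + yy**2) / (2.0 * (size / 4.0) ** 2))
        f = u * g                      # odd-symmetric oriented kernel
        filters.append(f - f.mean())   # zero-mean: flat regions respond ~0
    return filters

def nonlinearity(x, kind="energy"):
    """The nonlinearities compared in the abstract."""
    if kind == "energy":
        return x ** 2          # energy-based nonlinearity
    if kind == "abs":
        return np.abs(x)       # absolute value operator
    if kind == "relu":
        return np.maximum(x, 0.0)  # rectifier
    raise ValueError(kind)

def ml_descriptor(image, n_layers=2, n_bins=32, kind="energy"):
    """Stack fixed-filter layers; pool each response map into a histogram."""
    filters = make_fixed_filters()
    maps = [image.astype(np.float64)]
    for _ in range(n_layers):
        maps = [nonlinearity(convolve2d(m, f, mode="same"), kind)
                for m in maps for f in filters]
    feats = []
    for m in maps:
        hist, _ = np.histogram(m, bins=n_bins, density=True)
        feats.append(hist)
    return np.concatenate(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    texture = rng.random((64, 64))
    desc = ml_descriptor(texture, n_layers=2, kind="energy")
    print(desc.shape)  # 4 filters, 2 layers, 32 bins -> (4*4*32,) = (512,)
```

Because the filters are fixed rather than learned, the only design decisions are the filter bank, the depth, and the choice of nonlinearity, which is precisely what makes the per-nonlinearity comparison in the abstract straightforward to run.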