Abstract
In recent years, sparse representation theory has attracted considerable attention in the signal processing, pattern recognition, and computer vision communities. The choice of the dictionary matrix plays a key role in sparse-representation-based methods: it can be a pre-defined dictionary or one learned via an optimization procedure. Furthermore, the dictionary learning process can be extended to a non-linear setting through an appropriate kernel function in order to handle non-linearly structured data. In this framework, the choice of the kernel function is also a key step, and multiple kernel learning is an appealing strategy for addressing it. In this paper, within the framework of kernel sparse-representation-based classification, we propose an iterative algorithm for the coincident learning of the dictionary matrix and the multiple kernel function. The multiple kernel is taken to be a weighted sum of a set of basis kernel functions, where the weights are optimized so that the reconstruction error of the sparse-coded data is minimized. The proposed algorithm alternates between three steps: sparse coding, dictionary learning, and multiple kernel learning. The optimization is carried out under two different structures for the sparse-representation-based classifier, namely distributive and collective. Our experimental results show that the proposed algorithm outperforms existing sparse-coding-based approaches. They also confirm that the collective setting leads to better results when the number of training examples is limited, whereas the distributive setting is more appropriate when enough training samples are available.
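The weighted-sum construction of the multiple kernel can be sketched as follows. This is a minimal illustration, not the paper's implementation: the particular basis kernels (linear and RBF), their parameters, and the sample data are all hypothetical; only the convex combination of Gram matrices reflects the formulation described above.

```python
import numpy as np

# Hypothetical basis kernels (chosen for illustration, not taken from the paper).
def linear_kernel(X, Y):
    return X @ Y.T

def rbf_kernel(X, Y, gamma=0.5):
    # Squared Euclidean distances between rows of X and rows of Y.
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def multiple_kernel(X, Y, weights, bases):
    """Weighted sum of basis-kernel Gram matrices.

    The weights are assumed non-negative (and typically normalized to sum
    to one), as is common in multiple kernel learning formulations, so the
    combination remains a valid positive semi-definite kernel.
    """
    K = np.zeros((X.shape[0], Y.shape[0]))
    for w, k in zip(weights, bases):
        K += w * k(X, Y)
    return K

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
w = np.array([0.3, 0.7])                       # kernel weights (illustrative)
K = multiple_kernel(X, X, w, [linear_kernel, rbf_kernel])
print(K.shape)  # (5, 5)
```

In the algorithm described above, the weights `w` would not be fixed but re-optimized in the multiple-kernel-learning step so as to minimize the reconstruction error of the sparse-coded data.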