Abstract

We propose a novel multiple kernel learning (MKL) algorithm with a group lasso regularizer, called group lasso regularized MKL (GL-MKL), for heterogeneous feature selection. We extend the standard MKL formulation by imposing a mixed ℓ1 and ℓ2 norm constraint (known as group lasso) as the regularizer. Our GL-MKL determines the optimal base kernels, including the associated weights and kernel parameters, and yields a compact set of features with comparable or improved recognition performance. GL-MKL also avoids the problem of choosing a proper technique to normalize feature attributes collected from heterogeneous domains (and thus having different properties and distribution ranges). Unlike prior sequential feature selection methods, our approach does not need to exhaustively search the feature space, nor does it require prior knowledge of the optimal size of the feature subset. Comparisons with existing MKL and sequential feature selection methods on a variety of datasets confirm the effectiveness of our method in selecting a compact feature subset with comparable or improved classification performance.
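For concreteness, the mixed ℓ1/ℓ2 norm referenced above is the standard group lasso penalty: with base-kernel weights \(\beta = (\beta_1, \dots, \beta_M)\) partitioned into groups \(\mathcal{G}_1, \dots, \mathcal{G}_G\), it applies an ℓ2 norm within each group and an ℓ1-style sum across groups,

    \Omega(\beta) \;=\; \sum_{g=1}^{G} \lVert \beta_{\mathcal{G}_g} \rVert_2
    \;=\; \sum_{g=1}^{G} \Big( \sum_{m \in \mathcal{G}_g} \beta_m^2 \Big)^{1/2},

which drives entire groups of base kernels, and hence whole feature attributes, to exactly zero. The Python sketch below illustrates this mechanism under assumed details (one RBF base kernel per feature attribute, block soft-thresholding as the sparsifying step); the function names and the proximal update are illustrative and not the paper's actual optimization procedure.

    import numpy as np

    def rbf_kernel_1d(x, gamma=1.0):
        # Base kernel on a single feature attribute (assumed RBF form).
        d = x[:, None] - x[None, :]
        return np.exp(-gamma * d ** 2)

    def combined_kernel(X, beta, gamma=1.0):
        # MKL-style combination: one base kernel per attribute,
        # weighted by the (possibly group-sparse) weights beta.
        n, p = X.shape
        K = np.zeros((n, n))
        for j in range(p):
            K += beta[j] * rbf_kernel_1d(X[:, j], gamma)
        return K

    def group_lasso_penalty(beta, groups):
        # Mixed l1/l2 norm: l2 within each group, summed (l1) across groups.
        return sum(np.linalg.norm(beta[g]) for g in groups)

    def prox_group_lasso(beta, groups, lam):
        # Block soft-thresholding: groups whose l2 norm falls below lam
        # are zeroed out entirely, removing their features from the model.
        out = beta.copy()
        for g in groups:
            nrm = np.linalg.norm(beta[g])
            out[g] = 0.0 if nrm <= lam else (1.0 - lam / nrm) * beta[g]
        return out

    # Illustrative usage (all values hypothetical):
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 6))            # 6 heterogeneous feature attributes
    groups = [[0, 1], [2, 3], [4, 5]]       # attributes grouped, e.g., by source domain
    beta = prox_group_lasso(rng.uniform(size=6), groups, lam=0.5)
    K = combined_kernel(X, beta)            # kernel built from surviving attributes only

Because the thresholding acts on whole groups, the surviving weights directly identify a compact feature subset, without a combinatorial search over subsets or a preset subset size.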
