Abstract

The Hilbert–Schmidt independence criterion (HSIC) was originally designed to measure the statistical dependence between random variables via distribution embeddings in Hilbert spaces for statistical inference. In recent years, this criterion has been shown to tackle a wide range of learning problems owing to its effectiveness and high efficiency. In this article, we provide an in-depth survey of learning methods that use the HSIC for various learning problems, such as feature selection, dimensionality reduction, clustering, and kernel learning and optimization. Specifically, after introducing the basic idea of HSIC, we systematically review the typical learning models based on the HSIC, ranging from supervised to unsupervised learning, and from traditional machine learning to transfer learning and deep learning, followed by remaining challenges and future directions. The relationships between learning methods using the HSIC and other relevant learning algorithms are also discussed. We expect to provide practitioners with valuable guidelines for their specific domains by elucidating the similarities and differences among these learning models.
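
To make the basic idea concrete, the following is a minimal sketch of the widely used biased empirical HSIC estimator, HSIC_b = tr(KHLH)/(n-1)^2, where K and L are kernel matrices on the two samples and H = I - (1/n)11^T is the centering matrix. The Gaussian kernel, the bandwidths sigma_x and sigma_y, and the helper function gaussian_kernel are illustrative assumptions, not a prescribed implementation.

    import numpy as np

    def gaussian_kernel(X, sigma=1.0):
        # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel matrix.
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
        return np.exp(-d2 / (2 * sigma**2))

    def hsic_biased(X, Y, sigma_x=1.0, sigma_y=1.0):
        # Biased empirical HSIC estimator: tr(K H L H) / (n - 1)^2,
        # where H = I - (1/n) 11^T centers the kernel matrices.
        n = X.shape[0]
        K = gaussian_kernel(X, sigma_x)
        L = gaussian_kernel(Y, sigma_y)
        H = np.eye(n) - np.ones((n, n)) / n
        return np.trace(K @ H @ L @ H) / (n - 1) ** 2

    # Usage: a dependent pair of samples yields a larger HSIC value
    # than an independent pair drawn from the same marginals.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 1))
    Y_dep = X + 0.1 * rng.normal(size=(200, 1))  # strongly dependent on X
    Y_ind = rng.normal(size=(200, 1))            # independent of X
    print(hsic_biased(X, Y_dep), hsic_biased(X, Y_ind))

A larger HSIC value indicates stronger dependence, and HSIC is zero (in population) if and only if the variables are independent when characteristic kernels such as the Gaussian kernel are used; this property underlies its use across the learning problems surveyed here.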
