Abstract
Many recent works have combined two machine-learning topics, supervised distance metric learning and manifold embedding, into supervised nonlinear dimensionality reduction methods. We show that a combination of an early metric learning method and a recent unsupervised dimensionality reduction method empirically outperforms previous methods. In our method, the Riemannian distance metric measures local change of class distributions, and the dimensionality reduction method makes a rigorous tradeoff between precision and recall in retrieving similar data points based on the reduced-dimensional display. The resulting supervised visualizations are well suited to finding (sets of) similar data samples that have similar class distributions.
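The precision–recall tradeoff mentioned above can be made concrete with a cost function in the style of neighbor retrieval visualization: each data point induces a neighborhood distribution in the original space (P) and in the display (Q), and a parameter λ weights the two Kullback–Leibler divergences that penalize recall errors (missed neighbors) and precision errors (false neighbors). The sketch below is an illustrative reconstruction, not the paper's exact formulation; the Gaussian kernel, the bandwidth `sigma`, and the helper names are assumptions.

```python
import numpy as np

def neighborhood_probs(X, sigma=1.0):
    """Row-wise neighborhood probabilities from Gaussian-kernel distances.
    (Illustrative choice; the paper's actual neighborhood model may differ.)"""
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
    P = np.exp(-D / (2.0 * sigma ** 2))
    np.fill_diagonal(P, 0.0)                             # a point is not its own neighbor
    return P / P.sum(axis=1, keepdims=True)

def retrieval_cost(P, Q, lam=0.5, eps=1e-12):
    """lam * KL(P||Q) penalizes recall errors (true neighbors missed in the
    display); (1 - lam) * KL(Q||P) penalizes precision errors (false
    neighbors shown in the display)."""
    kl_pq = np.sum(P * np.log((P + eps) / (Q + eps)))
    kl_qp = np.sum(Q * np.log((Q + eps) / (P + eps)))
    return lam * kl_pq + (1.0 - lam) * kl_qp
```

In this framing, minimizing the cost over the display coordinates yields an embedding whose emphasis on precision versus recall is controlled explicitly by λ.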