Abstract

In this paper, we propose a regularized point-to-manifold distance metric that measures the distance between an unknown query object and object-specific manifolds for the task of multi-view multi-manifold learning. Our metric determines the class of the query object from the class of the objects with the smallest average geodesic distance to it. The proposed method has two key features. First, we automatically discover the object-specific manifolds: we focus on the fundamental problem of efficiently selecting uniform, class-consistent neighbors from all available poses for graph-based multi-manifold learning in a supervised manner, and we extract the most distinctive exemplars from the manifold of each object so that they cover the possible variations in pose angle. To distinguish object-specific pose-inconsistent samples from object-inconsistent pose-consistent samples that may lie very close together, we use a total variation regularized least squares problem to represent each object as a weighted sum of its class-consistent neighbors under different poses. Second, we use the information of k objects to decide the class of the query object: we measure the distance between the unknown object and the k exemplars of each object-specific manifold and assign the query to the class of the closest manifold. Numerical experiments on several benchmark multi-view datasets are reported and provide strong support for the proposed method. On average, our neighborhood graph improves SH-NGC and UTDTV, as new supervised multi-manifold learning and unsupervised multi-view multi-manifold learning methods, by more than 2.6% and 7%, respectively. In object recognition, our method also achieves results more than 5% better than the best result of state-of-the-art graph-based manifold learning methods.
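The two components described in the abstract, representing each object over its class-consistent pose neighbors with a variation penalty, and classifying a query by its average distance to each manifold's exemplars, can be illustrated with a minimal sketch. All function names here are hypothetical; a smoothed (quadratic) variation penalty stands in for the paper's exact total variation term, and Euclidean distance stands in for the geodesic distance computed on the neighborhood graph.

```python
import numpy as np

def tv_regularized_weights(x, D, lam=0.1):
    """Represent query x as a weighted sum of its class-consistent
    neighbors (the columns of D), penalizing variation between the
    weights of adjacent poses.

    NOTE: this sketch uses a *quadratic* variation penalty
    lam * ||L w||^2, which admits the closed form
    w = (D^T D + lam L^T L)^{-1} D^T x. The exact TV penalty
    (an l1 norm on the weight differences) is non-smooth and
    would require a proximal solver instead.
    """
    n = D.shape[1]
    # First-difference operator over the pose-ordered neighbors.
    L = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)
    A = D.T @ D + lam * (L.T @ L)
    return np.linalg.solve(A, D.T @ x)

def classify_by_manifold_distance(q, exemplars_per_class):
    """Assign q to the class whose exemplars have the smallest
    average distance to it. Euclidean distance is used here as a
    stand-in for the geodesic (shortest-path) distance on the
    neighborhood graph."""
    best_label, best_dist = None, np.inf
    for label, E in exemplars_per_class.items():
        d = np.mean(np.linalg.norm(E - q, axis=1))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label
```

In use, one would first solve the regularized representation to prune pose-consistent neighbors from the wrong class, then classify a query by calling `classify_by_manifold_distance` with the k retained exemplars of each object-specific manifold.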
