Abstract

Searching for relevant 3D models based on hand-drawn sketches is important for many applications, such as sketch-based 3D modeling and recognition. Sketch-based shape retrieval (SBSR) has therefore become an active research topic in model retrieval, pattern recognition, and computer vision. Deep 3D representations based on Convolutional Neural Networks (CNNs) have yielded significant performance improvements over previous state-of-the-art methods in 3D shape retrieval. Motivated by this, we propose a sketch-based 3D model retrieval algorithm that combines representative views with CNN feature matching. The representative views are selected by viewpoint entropy. The key observation is that a hand-drawn sketch resembles the 3D model as seen from some viewpoint, so a sketch and a projection of a model from the same class are similar. We therefore retain only a small set of high-entropy views as representative views, which reduces computational complexity and improves accuracy, and extract CNN descriptors as features for each representative view of every object. Experiments on the Shape Retrieval Contest (SHREC) 2012 and SHREC 2013 databases demonstrate that our method outperforms state-of-the-art approaches.
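As a rough illustration of the pipeline described above, the following is a minimal NumPy sketch of viewpoint-entropy-based view selection and descriptor matching. It assumes the renderer supplies per-face projected areas for each candidate view and that CNN descriptors have already been extracted for the sketch and the representative views; the function names, the entropy normalization (over visible faces only), and the use of cosine similarity are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def viewpoint_entropy(face_areas):
    """Entropy of one rendered view from per-face projected areas.

    A simplified variant: probabilities are the visible-face areas
    normalized by the total visible area. Higher entropy roughly means
    the view reveals more faces more evenly.
    """
    areas = np.asarray(face_areas, dtype=float)
    total = areas.sum()
    if total <= 0.0:
        return 0.0
    p = areas[areas > 0] / total
    return float(-(p * np.log2(p)).sum())

def select_representative_views(view_face_areas, k=6):
    """Keep the indices of the k views with the highest entropy."""
    scores = np.array([viewpoint_entropy(a) for a in view_face_areas])
    return np.argsort(scores)[::-1][:k]

def rank_views_by_sketch(sketch_feat, view_feats):
    """Rank representative views by cosine similarity of CNN descriptors.

    sketch_feat: (d,) descriptor of the query sketch.
    view_feats:  (n, d) descriptors of the representative views.
    """
    s = sketch_feat / np.linalg.norm(sketch_feat)
    v = view_feats / np.linalg.norm(view_feats, axis=1, keepdims=True)
    sims = v @ s
    return np.argsort(sims)[::-1], sims
```

In a full retrieval system, the similarity between a sketch and a 3D model would then be aggregated from the model's best-matching representative view(s), and models ranked by that aggregated score.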
