Abstract
This paper reports the results of the SHREC’22 track on Open-Set 3D Object Retrieval, whose goal is to evaluate the performance of retrieval algorithms under the open-set setting and the modality-missing setting, respectively. Since objects from unseen categories are common in real-world applications, we design the open-set 3D object retrieval task to broaden the scope of traditional 3D object retrieval. For this track, we construct two open-set 3D object retrieval datasets, OS-MN40 and OS-MN40-Miss, based on the ModelNet40 dataset: the former targets the open-set setting, and the latter targets both the open-set and modality-missing settings. Both datasets consist of a training set (2822 objects from 8 categories) and a retrieval set (960 query objects and 8527 target objects from the remaining 32 categories), so the categories of the retrieval (query/target) set are unseen during training. Each object in OS-MN40 is provided in four modalities: mesh, point cloud, multi-view, and voxel. Each object in OS-MN40-Miss carries incomplete modality information, simulating retrieval conditions in the real world. The track attracted eight participating groups from four countries, with 191 submitted runs in total. The evaluation results demonstrate promising performance for open-set retrieval of 3D objects with multi-modal and multi-resolution representations and reveal interesting insights into retrieving 3D objects from unknown categories.
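To make the dataset organization described above concrete, the following minimal Python sketch (not part of the official track toolkit; the class and the example category names are hypothetical) models the open-set split, in which the training categories and the retrieval (query/target) categories are disjoint and each object may carry up to four modalities.

```python
# Illustrative sketch of the open-set split; names are hypothetical,
# not the official OS-MN40 toolkit.
from dataclasses import dataclass, field

# Modalities provided for each OS-MN40 object; OS-MN40-Miss drops some of
# them per object to simulate modality-missing retrieval.
MODALITIES = ("mesh", "point_cloud", "multi_view", "voxel")


@dataclass
class OpenSetSplit:
    """Open-set split: retrieval categories are unseen during training."""
    train_categories: set = field(default_factory=set)      # 8 seen categories
    retrieval_categories: set = field(default_factory=set)  # 32 unseen categories

    def is_open_set(self) -> bool:
        # The defining property of the track: no category overlap between
        # the training set and the retrieval (query/target) set.
        return self.train_categories.isdisjoint(self.retrieval_categories)


if __name__ == "__main__":
    # Hypothetical category names, for illustration only.
    split = OpenSetSplit(
        train_categories={"airplane", "bed", "bench", "bookshelf",
                          "bottle", "bowl", "car", "chair"},
        retrieval_categories={"desk", "guitar", "lamp", "piano"},  # subset of the 32
    )
    assert split.is_open_set()
```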