Abstract

Classification tasks on 3D point clouds often assume that class events are independent and identically distributed (IID), even though this assumption discards the correlations between classes. This study proposes a classification strategy, Joint Graph Entropy Knowledge Distillation (JGEKD), suited to non-IID 3D point cloud data, which transfers knowledge of class correlations through knowledge distillation using a loss function based on joint graph entropy. First, we employ joint graphs to capture the hidden relationships between classes, and train our model via knowledge distillation by computing the entropy of this joint graph. Subsequently, to handle 3D point clouds that are invariant to spatial transformations, we construct Siamese structures and develop two frameworks, self-knowledge distillation and teacher-knowledge distillation, which transfer information between different transformed forms of the same data. In addition, we use these frameworks to transfer knowledge between point clouds and their corrupted forms, improving the model's robustness to corruption. Extensive experiments on ScanObjectNN, ModelNet40, ScanNetV2_cls, and ModelNet-C demonstrate that the proposed strategy achieves competitive results.
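The abstract does not give the exact form of the joint-graph-entropy loss, but the idea of building a class-to-class joint distribution from teacher and student predictions and scoring it by Shannon entropy can be sketched as follows. This is a hypothetical illustration, not the paper's implementation; the function name, temperature `T`, and the outer-product construction of the "joint graph" are all assumptions.

```python
import numpy as np

def softmax(logits, T=2.0):
    # Temperature-softened softmax over the class axis (numerically stable).
    z = np.exp((logits - logits.max(axis=1, keepdims=True)) / T)
    return z / z.sum(axis=1, keepdims=True)

def joint_graph_entropy_loss(student_logits, teacher_logits, T=2.0):
    """Hypothetical sketch of a joint-graph-entropy distillation signal.

    Builds a C x C joint class-probability matrix (the 'joint graph') as the
    batch-averaged outer product of softened student and teacher predictions,
    then returns its Shannon entropy.
    """
    p = softmax(student_logits, T)  # (B, C) student class probabilities
    q = softmax(teacher_logits, T)  # (B, C) teacher class probabilities
    # Per-sample outer product, averaged over the batch -> (C, C) joint graph.
    joint = np.einsum('bi,bj->ij', p, q) / p.shape[0]
    joint /= joint.sum()            # normalize to a joint distribution
    # Shannon entropy of the joint graph; small epsilon avoids log(0).
    return float(-(joint * np.log(joint + 1e-12)).sum())
```

In the self-distillation setting described above, `teacher_logits` would come from the same network applied to a spatially transformed (or corrupted) copy of the input, while the teacher-distillation variant would use a separate teacher network.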
