Spherical convolutional neural networks (Spherical CNNs) learn nonlinear representations from 3D data by exploiting the structure of the data, and they have shown promising performance in shape analysis, object classification, and planning, among other tasks. This paper investigates the properties of Spherical CNNs that pertain to the rotational structure inherent in spherical signals. We build upon the rotation equivariance of spherical convolutions to show that Spherical CNNs are stable to general structure perturbations. In particular, we model arbitrary structure perturbations as diffeomorphism perturbations and define the rotation distance, which measures how far these perturbations are from rotations. We prove that the change in the output of a Spherical CNN induced by a diffeomorphism perturbation is bounded in proportion to the size of the perturbation under the rotation distance. This stability property, coupled with rotation equivariance, provides theoretical guarantees that underpin the practical observations that Spherical CNNs exploit the rotational structure, maintain performance under structure perturbations that are close to rotations, and offer good generalization and faster learning.
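To convey the flavor of the stability statement, the following is a minimal sketch of the kind of bound described above; the symbols (the network map $\Phi$, spherical signal $x$, diffeomorphism $\tau$, rotation distance $d(\tau)$, and constant $C$) are illustrative assumptions rather than the paper's exact notation, and the precise norms and constants in the paper may differ:
\[
\min_{R \in \mathrm{SO}(3)} \big\| \Phi(x \circ \tau) - \Phi(x \circ R) \big\| \;\le\; C \, d(\tau) \, \| x \|,
\]
where $d(\tau) = 0$ whenever $\tau$ is itself a rotation, so in that case the bound reduces to exact rotation equivariance of the network output.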