Abstract
Dropout has been widely adopted to regularize graph convolutional networks (GCNs) by randomly zeroing entries of the node feature vectors, and it achieves promising performance on various tasks. However, the information of individually zeroed entries can still be present in other, correlated entries, because it propagates (1) spatially, between entries of different node feature vectors, and (2) depth-wise, between different entries of each node feature vector, which weakens the effectiveness of dropout. This is mainly because, in a GCN, neighboring node feature vectors are linearly transformed and then aggregated to produce the node feature vectors of the subsequent layer. To effectively regularize GCNs, we devise DropCluster, which first randomly zeros some seed entries and then zeros the entries that are spatially or depth-wise correlated with those seed entries. In this way, the information of the seed entries is thoroughly removed and cannot flow to subsequent layers via the correlated entries. We validate the effectiveness of the proposed DropCluster by comprehensively comparing it with dropout and its representative variants, such as SpatialDropout, Gaussian dropout, and DropEdge, on skeleton-based action recognition.
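To make the seed-then-expand idea concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of cluster-wise zeroing on a node-feature matrix. It assumes PyTorch, node features of shape (N, C), a binary adjacency matrix as the notion of spatial correlation, and a fixed channel window as a stand-in for depth-wise correlation; the function name and parameters are illustrative only.

```python
import torch
import torch.nn.functional as F

def drop_cluster(x, adj, p=0.1, channel_radius=1, training=True):
    """Sketch: zero random seed entries plus spatially / depth-wise related entries.

    x:   (N, C) node feature matrix
    adj: (N, N) binary adjacency matrix (1 where nodes are neighbors)
    p:   probability that an entry is selected as a seed
    """
    if not training or p == 0.0:
        return x

    N, C = x.shape
    # 1) sample seed entries to drop
    seeds = (torch.rand(N, C, device=x.device) < p).float()

    # 2) spatial expansion: drop entry (i, c) if any neighbor j of node i
    #    has a seed at channel c
    spatial = ((adj.float() @ seeds) > 0).float()

    # 3) depth-wise expansion: within each node, drop channels falling in a
    #    window around a seed channel (max-pool over the channel axis)
    depth = F.max_pool1d(
        seeds.unsqueeze(1), kernel_size=2 * channel_radius + 1,
        stride=1, padding=channel_radius).squeeze(1)

    drop_mask = torch.clamp(seeds + spatial + depth, max=1.0)
    keep = 1.0 - drop_mask
    # rescale surviving entries so the expected activation magnitude is preserved
    scale = keep.numel() / keep.sum().clamp(min=1.0)
    return x * keep * scale
```

In this sketch the whole correlated cluster around each seed is removed at once, so the seed's information cannot re-enter the next layer through its spatial or depth-wise neighbors; the actual definition of correlated entries in DropCluster may differ from the adjacency and channel-window proxies assumed here.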