Abstract
Spiking neural networks (SNNs) possess asynchronous, discrete, and sparse characteristics that enable direct processing of event-based tactile sensor data and more efficient information transmission, and they have been widely applied to tactile perception. However, because tactile data are sparse, these models are prone to generalization problems. In addition, existing tactile graph construction methods are limited when establishing spatial connections in high-dimensional tactile data, which makes it difficult for the model to fuse and process such information effectively. To address these challenges, we propose GGT-SNN, which introduces a parameter estimation method with Gaussian priors to alleviate the generalization problems caused by sparse tactile data. Furthermore, we design M-tree and Z-tree tactile graph construction methods to compensate for the deficiencies of existing approaches when handling high-dimensional tactile spatial connections, enhancing the model's ability to process high-dimensional event-driven tactile information efficiently. We also introduce an approximate LIF neuron activation function to enable backpropagation through the model and comprehensively compare and evaluate its performance against different approximate functions. Experimental results demonstrate that the proposed method significantly outperforms state-of-the-art (SOTA) methods, achieving a 10.83% improvement over TactileSGNet on the EvTouch-Containers dataset and a 2.92% improvement over TactileSGNet on the EvTouch-Objects dataset.
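To illustrate the idea behind an "approximate LIF neuron activation function" that enables backpropagation, the sketch below shows a standard surrogate-gradient formulation: the forward pass emits a hard spike, while the backward pass substitutes a smooth derivative. This is not the paper's exact function; the arctan-style surrogate, and the constants alpha, tau, and v_th are illustrative assumptions.

```python
# Minimal surrogate-gradient LIF sketch (assumptions: arctan surrogate, soft reset).
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; smooth surrogate derivative in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_th, alpha=2.0):
        ctx.save_for_backward(v_minus_th)
        ctx.alpha = alpha
        # Spike if the membrane potential exceeds the threshold
        return (v_minus_th > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_th,) = ctx.saved_tensors
        alpha = ctx.alpha
        # Derivative of a scaled arctan sigmoid, used in place of the Dirac delta
        surrogate_grad = alpha / (2 * (1 + (torch.pi / 2 * alpha * v_minus_th) ** 2))
        return grad_output * surrogate_grad, None

def lif_step(v, x, tau=2.0, v_th=1.0):
    """One discrete LIF update: leak, integrate input, spike, soft reset."""
    v = v + (x - v) / tau                      # leaky integration of the input current
    spike = SurrogateSpike.apply(v - v_th)     # differentiable thresholding
    v = v - spike * v_th                       # soft reset after a spike
    return spike, v

if __name__ == "__main__":
    v = torch.zeros(4)
    x = torch.rand(4, requires_grad=True)
    spikes, v = lif_step(v, x)
    spikes.sum().backward()                    # gradients flow through the surrogate
    print(spikes, x.grad)
```

Because the surrogate is only used in the backward pass, the network still produces binary spikes at inference time while remaining trainable with ordinary gradient-based optimizers.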