Abstract

The tactile properties of objects are important for robotic dexterous manipulation. An increasing number of attempts have recently been made to enable tactile information processing in robotic hands via tactile sensors. However, how to build tactile information processing models remains relatively unexplored. In this study, we aimed to develop a spiking neural network (SNN) based on the neural information processing mechanisms of sensory afferents. The SNN processes electrical signals collected from tactile sensor arrays attached to the gripper of a robotic hand while it grasps objects of different shapes. We converted each of the 42 channels of sensor signals from 2 arrays of 21 sensors into a spike train using the Izhikevich neuron model, and these spike trains were then fed to the SNN. The synaptic weights of the SNN were learned through Hebbian learning with a pair-based spike-timing-dependent plasticity (STDP) rule. In addition, we implemented lateral inhibition of the second-layer neurons based on unsupervised learning similar to that used in self-organizing maps, resulting in a winner-takes-all network. Through this unsupervised learning, the SNN learned to discriminate object shapes via tactile sensing; in particular, it demonstrated object shape recognition with 100% accuracy. The proposed model could be useful for robots manipulating objects with tactile senses.
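The abstract's encoding step, converting an analog sensor signal into a spike train with the Izhikevich model, can be illustrated with a minimal sketch. The parameter values below are the standard regular-spiking set from Izhikevich's original formulation, not values taken from the paper, and treating the raw sensor voltage directly as the input current I is an assumption for illustration.

```python
import numpy as np

def izhikevich_encode(signal, dt=1.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Convert one analog sensor channel into a spike train using the
    Izhikevich neuron model (regular-spiking parameters by default).

    signal : 1-D array of input currents, one value per time step
    dt     : integration step in ms
    Returns a boolean array marking the time steps at which spikes occur.
    """
    v = c               # membrane potential (mV)
    u = b * v           # recovery variable
    spikes = np.zeros(len(signal), dtype=bool)
    for t, I in enumerate(signal):
        # Euler integration of v' = 0.04v^2 + 5v + 140 - u + I
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        # and u' = a(bv - u)
        u += dt * a * (b * v - u)
        if v >= 30.0:   # spike threshold
            spikes[t] = True
            v = c       # reset membrane potential
            u += d      # bump recovery variable
    return spikes
```

Applied per channel, e.g. `trains = [izhikevich_encode(ch) for ch in sensor_channels]`, this would yield the 42 spike trains the abstract describes feeding into the SNN.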
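The pair-based STDP learning of the synaptic weights can likewise be sketched with exponentially decaying pre- and postsynaptic traces, a common implementation of the pair-based rule. The learning-rate and time-constant values (`A_plus`, `A_minus`, `tau_plus`, `tau_minus`) and the hard weight bounds are illustrative assumptions, not parameters reported in the paper.

```python
import numpy as np

def stdp_update(w, pre_spikes, post_spikes, dt=1.0,
                A_plus=0.01, A_minus=0.012,
                tau_plus=20.0, tau_minus=20.0,
                w_min=0.0, w_max=1.0):
    """Pair-based STDP on a single synapse, driven by spike traces.

    pre_spikes / post_spikes : boolean spike trains of equal length
    Returns the updated weight, clipped to [w_min, w_max].
    """
    x_pre, x_post = 0.0, 0.0                 # eligibility traces
    decay_pre = np.exp(-dt / tau_plus)
    decay_post = np.exp(-dt / tau_minus)
    for pre, post in zip(pre_spikes, post_spikes):
        x_pre *= decay_pre                   # traces decay each step
        x_post *= decay_post
        if pre:
            x_pre += 1.0
            w -= A_minus * x_post            # pre after post -> depression
        if post:
            x_post += 1.0
            w += A_plus * x_pre              # post after pre -> potentiation
        w = min(max(w, w_min), w_max)        # keep weight in bounds
    return w
```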
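Finally, the lateral inhibition of the second-layer neurons amounts to competition among output neurons. As a coarse sketch, assuming a hard winner-takes-all readout (the SOM-like neighborhood dynamics described in the abstract are not reproduced here), the most strongly driven neuron fires while inhibition silences the rest:

```python
import numpy as np

def winner_takes_all(input_spikes, weights):
    """Hard winner-takes-all over one time step.

    input_spikes : (n_in,) boolean vector of presynaptic spikes
    weights      : (n_out, n_in) synaptic weight matrix
    Returns the index of the winning output neuron, or None if no input.
    """
    if not input_spikes.any():
        return None
    drive = weights @ input_spikes.astype(float)  # summed input per neuron
    return int(np.argmax(drive))                  # only the winner fires
```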
