Abstract

The emulation of human multisensory functions to construct artificial perception systems is an intriguing challenge for developing humanoid robotics and cross-modal human–machine interfaces. Inspired by human multisensory signal generation and neuroplasticity-based signal processing, an artificial perceptual neuron array with visual–tactile sensing, processing, learning, and memory is demonstrated here. The neuromorphic bimodal perception array compactly combines an artificial photoelectric synapse network with an integrated mechanoluminescent layer, enabling individual and synergistic plastic modulation of optical and mechanical information, including short-term memory, long-term memory, paired-pulse facilitation, and "learning-experience" behavior. Sequential or superimposed visual and tactile stimuli can efficiently simulate the associative learning process of "Pavlov's dog". The fusion of visual and tactile modulation enhances memory of the stimulation image during the learning process. A machine-learning algorithm coupled with an artificial neural network performs pattern recognition, achieving a recognition accuracy of 70% for bimodal training, higher than that obtained with unimodal training. In addition, the artificial perceptual neuron has a low energy consumption of ∼20 pJ. With its mechanical compliance and simple architecture, the neuromorphic bimodal perception array has promising applications in large-scale cross-modal interaction and high-throughput intelligent perception.
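The full text is not reproduced here, so the following is only a minimal sketch of the feature-level fusion idea the abstract describes: a single-layer softmax classifier trained on synthetic "visual" and "tactile" feature vectors, comparing unimodal (visual-only) with bimodal (concatenated) inputs. All data, dimensions, and hyperparameters are illustrative assumptions; this is not the authors' device model or recognition pipeline, and no particular accuracy figure is implied.

```python
# Illustrative sketch only (not the authors' code): a one-layer softmax
# classifier on synthetic multisensory data, contrasting visual-only
# input with visual + tactile fusion. All shapes and values are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_vis, n_tac, n_samples = 4, 16, 16, 400

# Synthetic dataset: each class has a noisy visual and tactile "template".
vis_templates = rng.normal(size=(n_classes, n_vis))
tac_templates = rng.normal(size=(n_classes, n_tac))
labels = rng.integers(0, n_classes, size=n_samples)
visual = vis_templates[labels] + rng.normal(scale=1.5, size=(n_samples, n_vis))
tactile = tac_templates[labels] + rng.normal(scale=1.5, size=(n_samples, n_tac))

def train_softmax(x, y, lr=0.1, epochs=300):
    """Train a single-layer softmax classifier with plain gradient descent."""
    w = np.zeros((x.shape[1], n_classes))
    onehot = np.eye(n_classes)[y]
    for _ in range(epochs):
        logits = x @ w
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        w -= lr * x.T @ (p - onehot) / len(x)
    return w

def accuracy(w, x, y):
    return np.mean((x @ w).argmax(axis=1) == y)

split = n_samples // 2  # first half for training, second half for testing
bimodal = np.hstack([visual, tactile])  # simple feature-level fusion
for name, feats in [("visual only", visual), ("bimodal fusion", bimodal)]:
    w = train_softmax(feats[:split], labels[:split])
    print(f"{name}: test accuracy = {accuracy(w, feats[split:], labels[split:]):.2f}")
```

On data like this, concatenating the two modalities generally improves test accuracy because each modality carries independent class information, which mirrors the abstract's qualitative claim that bimodal training outperforms unimodal training.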
