In this work, we propose a multisensory mutual associative memory network framework and a memristive circuit that mimic the biological brain's ability to associate information received simultaneously. Inspired by the neural mechanisms of associative memory cells, the circuit consists of three modules: 1) the storage neuron module, which encodes external multimodal information into spike firing rates; 2) the synapse module, which exploits the nonvolatility of memristors to achieve weight adjustment and associative learning; and 3) the retrieval neuron module, which feeds the retrieval signal output from each sensory pathway to the other sensory pathways, thereby achieving mutual association and retrieval among multiple modalities. Unlike previous one-to-one or many-to-one unidirectional associative memory works, this circuit achieves bidirectional association from multiple modalities to multiple modalities. In addition, we simulate the acquisition, extinction, recovery, transmission, and consolidation properties of associative memory. The circuit is applied to cross-modal association of image and audio recognition results, and episodic memory is simulated, in which multiple images in a scene are associated intramodally. Power and area analyses validate the circuit as hardware-friendly. Extending this work to large-scale associative memory networks, combined with visual, auditory, tactile, and gustatory sensors, is promising for intelligent robotic platforms and would facilitate the development of neuromorphic systems and brain-like intelligence.
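The mutual cross-modal association described above can be sketched in software with a bidirectional associative memory. The following is a minimal illustrative model, not the paper's circuit: bipolar vectors stand in for the spike-rate encodings of two modalities, a Hebbian outer-product rule stands in for memristive weight adjustment, and recall in either direction stands in for the retrieval neuron module. All names and patterns here are hypothetical.

```python
import numpy as np

# Hypothetical stored patterns: two image codes paired with two audio codes.
# Bipolar (+1/-1) entries stand in for high/low spike firing rates.
image_codes = np.array([[ 1, -1,  1, -1],
                        [-1,  1, -1,  1]])
audio_codes = np.array([[ 1,  1, -1],
                        [-1, -1,  1]])

# Hebbian outer-product learning plays the role of memristive weight
# adjustment: co-active neuron pairs strengthen their connecting weight.
W = image_codes.T @ audio_codes

def recall_audio(image):
    """Retrieve the associated audio pattern from an image pattern."""
    return np.sign(image @ W)

def recall_image(audio):
    """Retrieve the associated image pattern from an audio pattern."""
    return np.sign(audio @ W.T)
```

Because the same weight matrix is traversed in both directions, either modality can cue the other, which is the many-to-many bidirectionality the circuit realizes in hardware.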