Resource-constrained computing devices, such as those used in the Internet of Things (IoT), require low power consumption, high performance, and a small footprint to operate efficiently. Resistive random access memory (ReRAM) is a promising technology for building novel in-memory computing architectures because it performs storage and computation in the same physical element with low energy and high density. Interest in ReRAM-based search engines and neural network (NN) accelerators has grown significantly, especially for IoT devices. In this paper, we propose a memristor-based voltage-resistance XNOR (VR-XNOR) cell. We demonstrate its advantages by building a reconfigurable content-addressable memory (CAM) architecture that supports both binary CAM (BCAM) and ternary CAM (TCAM) modes and enables approximate search operations. We also apply the memristor-based VR-XNOR cell to binarized convolutional neural networks (CNNs), focusing on the convolution operation, by replacing the convolution module with XNOR-based filter banks. Simulations of the proposed search-engine and feature-extraction architectures were carried out using the VTEAM model in the Cadence Virtuoso Analog Design Environment. The proposed filter bank architecture achieves a 1-ns extraction cycle time over ${N}$ filters and produces multiple output feature maps in a single processing cycle. The filter uses two memristor devices to realize each XNOR gate and yields a significant reduction in the number of multiply-add operations.
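The arithmetic that XNOR-based filter banks exploit can be illustrated in software: for vectors binarized to $\{+1,-1\}$, a dot product reduces to an XNOR followed by a popcount, eliminating multiply-add operations. The sketch below is illustrative only (it models the computation, not the paper's memristor circuit), and all function names are hypothetical.

```python
import numpy as np

def binarize(x):
    """Map real values to {+1, -1} by sign (0 maps to +1), as in binarized CNNs."""
    return np.where(x >= 0, 1, -1)

def xnor_dot(a, w):
    """Dot product of two {+1, -1} vectors via XNOR + popcount.

    Encoding +1 as True and -1 as False, XNOR is 1 exactly where the
    signs match, so dot = matches - mismatches = 2 * popcount(XNOR) - N.
    """
    a_bits = a > 0
    w_bits = w > 0
    matches = np.count_nonzero(~(a_bits ^ w_bits))  # XNOR, then popcount
    return 2 * matches - len(a)

a = binarize(np.array([0.3, -1.2, 0.7, -0.1]))
w = binarize(np.array([1.0, -0.5, -0.9, 0.2]))
assert xnor_dot(a, w) == int(a @ w)  # agrees with the multiply-add result
```

Sliding such an XNOR-popcount kernel over a binarized input map yields the convolution's output feature map without any multiplications, which is the software analogue of the filter-bank replacement described above.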