Summary

This paper presents a charge-based integrate-and-fire (IF) circuit for in-memory binary spiking neural networks (BSNNs). The proposed IF circuit mimics both addition and subtraction operations, which permits tighter integration with in-memory XNOR-based synapses to implement the BSNN processing core. To evaluate the proposed design, we have developed a framework that incorporates the circuit's non-ideal effects into the system-level simulation. The array circuits use 2T-2J Spin-Transfer-Torque Magnetoresistive RAM (STT-MRAM) based on a 65-nm commercial CMOS process and a fitted magnetic tunnel junction (MTJ) model. The system model is described in PyTorch to best fit the parameters extracted at the circuit level, including coverage of device nonidealities and process variations. Simulation results show that the proposed design achieves an energy efficiency of 5.10 fJ/synapse and reaches 82.01% classification accuracy on CIFAR-10 under process variation.
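To illustrate the behavioral idea behind the abstract, the following is a minimal, hypothetical sketch (not the authors' code, and deliberately dependency-free rather than in PyTorch) of an XNOR-based binary synapse column feeding an IF neuron that supports both addition and subtraction; the names `xnor_popcount` and `IFNeuron` are illustrative assumptions.

```python
# Hypothetical system-level sketch, not the paper's implementation.

def xnor_popcount(spikes, weights):
    """Binary MAC via XNOR: a bit match contributes +1, a mismatch
    contributes -1, mirroring the add/subtract capability of the
    charge-based IF circuit described in the paper."""
    return sum(1 if s == w else -1 for s, w in zip(spikes, weights))

class IFNeuron:
    """Integrate-and-fire neuron: accumulates signed synaptic charge
    on a membrane variable and fires when it crosses the threshold."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.v = 0.0  # membrane potential (accumulated charge)

    def step(self, current):
        self.v += current              # charge integration (can subtract)
        if self.v >= self.threshold:
            self.v = 0.0               # reset after firing
            return 1                   # emit a spike
        return 0

# Example: one timestep of a 4-input binary synapse column.
spikes  = [1, 0, 1, 1]   # input spike bits
weights = [1, 1, 1, 0]   # binary weights
neuron = IFNeuron(threshold=2)
out = neuron.step(xnor_popcount(spikes, weights))
```

In a full BSNN model, the same signed accumulation would run per output neuron per timestep, with circuit-extracted nonidealities (e.g., threshold variation) injected into `threshold` and the accumulated current.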