Pseudo-crossbar arrays using ferroelectric field-effect transistors (FEFETs) mitigate weight movement and enable <i>in situ</i> vector–matrix multiplication (VMM), which can significantly accelerate online training of deep neural networks (DNNs). However, the training accuracy of DNNs using conventional FEFETs is low because of non-idealities such as nonlinearity, asymmetry, limited bit precision, and limited dynamic range of the weight updates. The limited endurance of these devices degrades the training accuracy further. Here, we present a novel approach for designing the gate stack of an FEFET analog synapse using a superlattice (SL) of ferroelectric (FE)/dielectric (DE)/FE. The partial polarization states are stabilized by harnessing the depolarization field from the DE spacer, which mitigates the weight update non-idealities. We demonstrate a 7-bit SL-FEFET analog synapse with an improved weight update profile, resulting in 94.1% online training accuracy on the MNIST handwritten digit classification task. The device uses an indium–tungsten–oxide (IWO) channel and a back-end-of-line (BEOL)-compatible process flow. The absence of a low-<inline-formula> <tex-math notation="LaTeX">${k}$ </tex-math></inline-formula> interlayer (IL) results in high endurance (>10<sup>10</sup> cycles), while the BEOL compatibility paves the way to high-density integration of pseudo-crossbar arrays and flexibility for neuromorphic circuit design.
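The weight-update nonlinearity mentioned above can be illustrated with a common behavioral model of analog-synapse potentiation (this sketch and its parameters, including the nonlinearity factor `nu`, are illustrative assumptions, not taken from the paper): conductance saturates exponentially with the number of programming pulses, so early pulses move the weight far more than later ones, whereas an ideal synapse responds linearly.

```python
import math

def conductance_after_pulses(n_pulses, n_max=128, g_min=0.0, g_max=1.0, nu=3.0):
    """Illustrative behavioral model of analog-synapse potentiation.

    Conductance saturates exponentially with pulse count; nu controls
    the update nonlinearity (nu -> 0 approaches the ideal linear case).
    All parameter values here are hypothetical, not device data.
    """
    if abs(nu) < 1e-9:
        # Ideal linear update: each pulse contributes equally.
        return g_min + (g_max - g_min) * n_pulses / n_max
    frac = (1.0 - math.exp(-nu * n_pulses / n_max)) / (1.0 - math.exp(-nu))
    return g_min + (g_max - g_min) * frac

# Nonlinear device: the first 32 pulses change conductance much more
# than the last 32, which distorts gradient-descent weight updates.
nonlinear = [conductance_after_pulses(n, nu=3.0) for n in (0, 32, 64, 96, 128)]

# Near-linear device (tiny nu), the behavior an improved weight-update
# profile aims for: equal pulses give equal conductance steps.
linear = [conductance_after_pulses(n, nu=1e-12) for n in (0, 32, 64, 96, 128)]
```

A 7-bit synapse in this picture corresponds to 128 distinguishable conductance levels; the closer the update curve is to linear, the more uniformly those levels are spaced.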