State-space modeling (SSM) provides a general framework for many image reconstruction tasks. Errors in the a priori physiological knowledge underlying the imaging physics, however, can introduce errors into the solutions. Modern deep-learning approaches show great promise but lack interpretability and rely on large amounts of labeled data. In this paper, we present a novel hybrid SSM framework for electrocardiographic imaging (ECGI) that combines the advantages of state-space formulations with data-driven learning. We first leverage the physics-based forward operator to supervise the learning. We then introduce neural modeling of the transition function along with an associated Bayesian filtering strategy. We applied the hybrid SSM framework to reconstruct electrical activity on the heart surface from body-surface potentials. In unsupervised settings on both in-silico and in-vivo data, without cardiac electrical activity available as ground truth to supervise the learning, we demonstrated that the hybrid SSM framework trained from a small number of ECG observations improved ECGI performance over the fixed SSM. We further demonstrated that, when in-silico simulation data become available, mixed supervised and unsupervised training of the hybrid SSM achieved further improvements of 40.6% and 45.6%, respectively, over traditional ECGI baselines and supervised data-driven ECGI baselines for localizing the origin of ventricular activations in real data.
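To make the hybrid SSM idea concrete, the sketch below pairs a neural transition function with a known physics-based forward (lead-field) operator and an extended-Kalman-style Bayesian filtering loop, trained from body-surface observations only. This is a minimal illustrative sketch under assumed choices, not the authors' implementation; all names (TransitionNet, ekf_step, H, Q, R) and the EKF linearization are assumptions introduced here for clarity.

```python
# Minimal, hypothetical sketch of a hybrid state-space model for ECGI:
# neural transition f_theta + known forward operator H + EKF-style filtering,
# trained unsupervised from ECG (body-surface) observations alone.
import torch
import torch.nn as nn


class TransitionNet(nn.Module):
    """Neural transition function f_theta: x_t -> x_{t+1} over heart-surface potentials."""
    def __init__(self, n_nodes, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_nodes, hidden), nn.Tanh(),
            nn.Linear(hidden, n_nodes),
        )

    def forward(self, x):
        return x + self.net(x)  # residual step keeps the learned dynamics near identity


def ekf_step(f, x, P, y, H, Q, R):
    """One predict/update cycle of an extended Kalman filter with a neural transition."""
    # Predict: propagate the mean with f_theta and the covariance with its local Jacobian.
    x_pred = f(x)
    F = torch.autograd.functional.jacobian(f, x)  # treated as a constant (no grad) in this sketch
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction using the physics-based forward (lead-field) operator H.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ torch.linalg.inv(S)
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (torch.eye(P.shape[0]) - K @ H) @ P_pred
    return x_new, P_new


def unsupervised_loss(f, y_seq, H, Q, R, x0, P0):
    """Train from ECG observations only: penalize mismatch in body-surface space."""
    x, P, loss = x0, P0, 0.0
    for y in y_seq:  # y_seq: sequence of body-surface potential vectors (one per time step)
        x, P = ekf_step(f, x, P, y, H, Q, R)
        loss = loss + torch.sum((y - H @ x) ** 2)  # the forward operator supervises learning
    return loss / len(y_seq)
```

In this sketch, H is the precomputed lead-field matrix (body-surface leads by heart-surface nodes), and the filtered heart-surface states are never compared against ground-truth cardiac potentials; the only training signal is the discrepancy between the observed and reconstructed body-surface potentials, mirroring the unsupervised setting described above.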