Smooth interaction and coordination between lower-limb amputees and their prostheses are crucial when performing complex tasks such as running. To address this challenge, simultaneous and proportional control (SPC) based on continuous estimation of joint angles from surface electromyogram (sEMG) signals offers a more seamless interaction between users and their prosthetic limbs than conventional pattern-recognition-based control, which relies on classification algorithms to recognize pre-defined movement classes from specific sEMG patterns. This study proposes a deep learning-based model (AM-BiLSTM) that integrates an attention mechanism (AM) with a bidirectional long short-term memory (BiLSTM) network to provide accurate and robust SPC for estimating the knee joint angle during running from a minimal number of sEMG sensors. The sEMG signals of four muscles, recorded from 14 subjects during treadmill running at various speeds, were used by the AM-BiLSTM model to decode knee joint kinematics. To comprehensively investigate the generalizability of the proposed model, it was tested in intra-subject and intra-speed, intra-subject and inter-speed, and inter-subject and intra-speed scenarios and compared with BiLSTM, standard LSTM, and multi-layer perceptron (MLP) approaches. Normalized root-mean-square error (NRMSE) and the correlation coefficient were used as performance metrics. According to nonparametric statistical tests, the proposed AM-BiLSTM model significantly outperformed the BiLSTM network as well as the classical LSTM and MLP networks in all three experiments (p-value < 0.05) and achieved state-of-the-art performance. Based on our findings, the AM-BiLSTM model may facilitate the deployment of intuitive, user-driven, deep learning-based control schemes for a wide variety of miniaturized robotic lower-limb devices, including myoelectric prostheses.
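As a concrete illustration of the approach summarized above, the following minimal sketch shows how an attention-augmented BiLSTM regressor can map a window of four-channel sEMG activity to a knee joint angle estimate, together with the two reported performance metrics. The PyTorch implementation, layer sizes, additive attention formulation, and metric definitions (NRMSE normalized by the range of the reference angle; Pearson correlation) are illustrative assumptions and do not reproduce the authors' exact configuration.

```python
# Minimal sketch (assumptions: PyTorch, additive attention over time steps,
# window-level regression); not the authors' exact AM-BiLSTM configuration.
import torch
import torch.nn as nn

class AMBiLSTM(nn.Module):
    def __init__(self, n_channels=4, hidden_size=64):
        super().__init__()
        # Bidirectional LSTM encodes the sEMG window in both time directions.
        self.bilstm = nn.LSTM(input_size=n_channels, hidden_size=hidden_size,
                              batch_first=True, bidirectional=True)
        # Attention layer scores one weight per time step.
        self.attn = nn.Linear(2 * hidden_size, 1)
        # Regression head outputs a single joint-angle estimate per window.
        self.head = nn.Linear(2 * hidden_size, 1)

    def forward(self, x):                        # x: (batch, time, channels)
        h, _ = self.bilstm(x)                    # h: (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)   # attention weights over time
        context = (w * h).sum(dim=1)             # weighted sum of hidden states
        return self.head(context).squeeze(-1)    # (batch,) predicted angle

def nrmse(y_true, y_pred):
    """Root-mean-square error normalized by the range of the reference angle."""
    rmse = torch.sqrt(torch.mean((y_true - y_pred) ** 2))
    return rmse / (y_true.max() - y_true.min())

def correlation(y_true, y_pred):
    """Pearson correlation coefficient between reference and estimated angles."""
    yt, yp = y_true - y_true.mean(), y_pred - y_pred.mean()
    return (yt * yp).sum() / (yt.norm() * yp.norm())

if __name__ == "__main__":
    # Example: one batch of 8 sEMG windows, 200 samples long, 4 channels.
    model = AMBiLSTM()
    x = torch.randn(8, 200, 4)
    print(model(x).shape)  # torch.Size([8])
```

In this sketch, the attention weights allow the regressor to emphasize the portions of the input window in which muscle activity is most informative for the knee angle, which is consistent with the role attributed to the AM component in the abstract.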