Neuromuscular injuries can impair hand function and profoundly impact quality of life, which has motivated the development of advanced assistive robotic hands. However, current neural decoding systems are limited in their ability to provide dexterous control of these robotic hands. In this study, we propose a novel method for concurrently predicting the extension and flexion forces of three individual fingers using high-density electromyogram (HD-EMG) signals. Our method employs two deep forest models, a flexor decoder and an extensor decoder, to extract relevant representations from the EMG amplitude features. The outputs of the two decoders are integrated through linear regression to predict the forces of the three fingers. The proposed method was evaluated on data from three subjects, and the results show that it consistently outperforms the conventional EMG amplitude-based approach in terms of prediction error and robustness across both target and non-target fingers. This work presents a promising neural decoding approach for intuitive and dexterous control of the fingertip forces of assistive robotic hands.
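
The two-decoder pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: scikit-learn's `RandomForestRegressor` stands in for the deep forest models, and the data shapes, synthetic features, and variable names are all assumptions for demonstration.

```python
# Sketch of a two-decoder force-prediction pipeline (assumptions:
# RandomForestRegressor substitutes for the deep forest models; the
# HD-EMG feature shapes and synthetic data are illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic EMG amplitude features: 500 time windows x 64 channels
# per electrode grid (one grid over the flexors, one over the extensors).
X_flex = rng.random((500, 64))
X_ext = rng.random((500, 64))
# Targets: flexion/extension forces of three fingers (500 windows x 3).
y = rng.random((500, 3))

# Separate decoders extract representations from each muscle group.
flexor_decoder = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_flex, y)
extensor_decoder = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_ext, y)

# Integrate the two decoders' outputs through linear regression.
Z = np.hstack([flexor_decoder.predict(X_flex), extensor_decoder.predict(X_ext)])
fusion = LinearRegression().fit(Z, y)

forces = fusion.predict(Z)  # predicted forces for the three fingers
print(forces.shape)  # (500, 3)
```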