Conventional hybrid models often overlook a factor that can degrade performance: the intrinsic sequence dependence that arises when different neural network (NN) architectures are combined. This study addresses that gap by examining how the hybridization sequence, the order in which layer types are stacked, affects model effectiveness. Dense, long short-term memory (LSTM), and gated recurrent unit (GRU) layers are combined using the Keras Sequential API to define each architecture. To capture context from both directions of a sequence, bidirectional LSTM (BiLSTM) and bidirectional GRU (BiGRU) layers replace their unidirectional counterparts. Of the 25 NN models tested, 18 are four-layer hybrids in which one layer is dense and the remaining three are BiLSTM and BiGRU layers. These hybrid models are evaluated in a supervised regression setting, with mean column-wise root mean square error (MCRMSE) as the performance metric. The results show that each hybrid model yields distinct outcomes depending on its specific layer sequence. The Hybrid_LGSS model outperforms existing three-layer BiLSTM networks in predictive accuracy and exhibits less overfitting (MCRMSEs of 0.0749 for training and 0.0767 for validation). This indicates that choosing the right hybridization sequence is crucial for balancing performance and simplicity. In summary, this research could help vaccinologists develop better mRNA vaccines and offer data analysts new insights for improving model design.
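To make the layer-stacking idea concrete, here is a minimal sketch of one possible four-layer hybrid built with the Keras Sequential API named in the abstract. The layer order, unit counts, input shape, and output dimension are illustrative assumptions, not the paper's actual Hybrid_LGSS configuration.

```python
# A hypothetical four-layer hybrid: three bidirectional recurrent layers
# plus one dense layer, matching the abstract's one-dense-in-four layer mix.
# All hyperparameters below are placeholders, not the paper's settings.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, GRU, Bidirectional

model = Sequential([
    # Bidirectional wrappers read the sequence in both directions;
    # return_sequences=True keeps a per-position output for regression.
    Bidirectional(LSTM(64, return_sequences=True), input_shape=(None, 8)),
    Bidirectional(GRU(64, return_sequences=True)),
    Bidirectional(LSTM(64, return_sequences=True)),
    Dense(3),  # per-position regression targets (placeholder dimension)
])

model.compile(optimizer="adam", loss="mse")
model.summary()
```

Reordering the three recurrent layers, or moving the dense layer, produces a different hybrid sequence; the study's point is that these permutations are not interchangeable in performance.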
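For reference, MCRMSE is conventionally defined as the root mean square error computed per target column and then averaged across columns. The abstract does not show the paper's notation, so the symbols here (n samples, m target columns) are assumed:

$$\mathrm{MCRMSE} = \frac{1}{m}\sum_{j=1}^{m}\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_{ij}-\hat{y}_{ij}\right)^{2}}$$

where $y_{ij}$ and $\hat{y}_{ij}$ are the true and predicted values of sample $i$ for target column $j$.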