INTRODUCTION: This work introduces a Probabilistic Descent Ensemble (PDE) approach to malware prediction with deep learning, leveraging multiple neural network models with distinct architectures and training strategies to achieve superior accuracy while minimizing false positives. OBJECTIVES: Combining Stochastic Gradient Descent (SGD) with early stopping is a potent approach to optimizing deep learning model training. Early stopping, a vital component, monitors a validation metric and halts training once that metric stops improving or begins to degrade, guarding against overfitting. METHODS: The synergy between SGD and early stopping creates a dynamic training framework, adaptable to diverse tasks and datasets, with potential benefits including reduced training time and enhanced generalization. RESULTS: The proposed work trains a Gaussian Naive Bayes (NB) classifier with SGD as the optimization algorithm. Gaussian NB is a probabilistic classifier that assumes the features follow a Gaussian (normal) distribution; SGD iteratively updates model parameters to minimize a loss function. CONCLUSION: The proposed approach achieves 99% accuracy in malware prediction while avoiding overfitting and poor local minima.
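The abstract does not detail how the Gaussian NB parameters are updated via SGD, so the following is only a minimal sketch of the SGD-plus-early-stopping mechanism it describes, using scikit-learn's SGDClassifier with logistic loss as a probabilistic stand-in. The synthetic dataset and all hyperparameters are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (assumed setup): an SGD-trained probabilistic classifier
# with early stopping on a held-out validation fraction.
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a malware feature matrix (0 = benign, 1 = malicious).
X, y = make_classification(n_samples=5000, n_features=30, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# loss="log_loss" yields probabilistic outputs; early_stopping=True holds out
# validation_fraction of the training data and halts once the validation
# score fails to improve for n_iter_no_change consecutive epochs.
clf = SGDClassifier(
    loss="log_loss",
    early_stopping=True,
    validation_fraction=0.1,
    n_iter_no_change=5,
    random_state=0,
)
clf.fit(X_train, y_train)

print("held-out accuracy:", clf.score(X_test, y_test))
print("P(malware) for first test sample:", clf.predict_proba(X_test[:1])[0, 1])
```

Early stopping here acts as the regularizer the abstract credits it with: training ends as soon as validation performance plateaus, rather than continuing until the model memorizes the training set.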