Abstract

A new early stopping criterion that combines the probability density function error and the validation error was designed to evaluate the training process. Under this criterion, the real status of the backpropagation neural network (BPNN) is jointly evaluated by the validation error and the probability density error, computed over both the traditional training data and the unlabeled data that the traditional training process does not consider, to determine the appropriate stopping time. The proposed approach was tested against three classes of automatic stopping criteria. In addition, it was compared with stopping criteria based solely on the validation error or the probability density error, using four data sets selected from the UCI Machine Learning Repository. Results showed that the new early stopping criterion enhances the generalization capability of the BPNN, improving forecasting precision by 2%-3%.
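The abstract describes jointly monitoring the validation error and a probability density error to decide when to stop training. The paper does not give an explicit formula, so the following is only a minimal sketch of one plausible combination: a weighted sum of the two error signals with a patience-based stopping rule. The weight `w`, the `patience` parameter, and the function name are assumptions for illustration, not the authors' method.

```python
def combined_early_stopping(val_errors, density_errors, w=0.5, patience=3):
    """Illustrative early stopping on a combined criterion (assumed form).

    val_errors: per-epoch validation errors.
    density_errors: per-epoch probability density errors (e.g. estimated
        on training plus unlabeled data, as the abstract suggests).
    w: assumed weight blending the two signals.
    patience: epochs to wait without improvement before stopping.
    Returns the epoch index with the best combined score.
    """
    best = float("inf")
    best_epoch = 0
    wait = 0
    for epoch, (v, d) in enumerate(zip(val_errors, density_errors)):
        score = w * v + (1 - w) * d  # combined criterion (assumed form)
        if score < best:
            best, best_epoch, wait = score, epoch, 0
        else:
            wait += 1
            if wait >= patience:  # no improvement for `patience` epochs
                break
    return best_epoch
```

For example, with validation errors `[0.5, 0.4, 0.3, 0.35, 0.4, 0.45]` and density errors `[0.6, 0.5, 0.4, 0.45, 0.5, 0.55]`, the combined score bottoms out at epoch 2, so training would roll back to that epoch's weights.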
