Abstract

During the training of neural networks, selecting the right stopping criterion is crucial to prevent overfitting and conserve computing power. While early stopping and the maximum-number-of-epochs criterion are simple to implement, they have limitations in identifying the point during training at which the training and validation losses begin to diverge. To overcome these limitations, we propose a general correlation-based stopping criterion, the Correlation-Driven Stopping Criterion (CDSC). The CDSC halts training when the rolling Pearson correlation between the training-set and validation-set losses falls below a pre-defined threshold. To demonstrate the effectiveness of the CDSC, we compared it with early stopping and the maximum-number-of-epochs criterion across several common machine learning problems and neural network models. Our study shows that the proposed CDSC improves the out-of-sample performance of all tested neural network models while conserving computing power.
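For illustration only, the sketch below shows how a correlation-driven stopping check of the kind described in the abstract might be wired into a training loop: the rolling Pearson correlation between recent training and validation losses is computed each epoch, and training stops once it drops below a threshold. The window size, threshold value, and function names here are assumptions chosen for the example, not the authors' implementation.

```python
import numpy as np

def pearson_corr(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xd, yd = x - x.mean(), y - y.mean()
    denom = np.sqrt((xd ** 2).sum() * (yd ** 2).sum())
    return 0.0 if denom == 0 else float((xd * yd).sum() / denom)

def correlation_driven_stop(train_losses, val_losses, window=10, threshold=0.5):
    """Return True when the rolling correlation of the last `window`
    training/validation losses falls below `threshold`.

    `window` and `threshold` are illustrative hyperparameters, not
    values taken from the paper.
    """
    if len(train_losses) < window or len(val_losses) < window:
        return False  # not enough history to form a rolling window yet
    r = pearson_corr(train_losses[-window:], val_losses[-window:])
    return r < threshold

# Hypothetical usage inside a training loop:
#
# train_hist, val_hist = [], []
# for epoch in range(max_epochs):
#     train_hist.append(train_one_epoch(model))   # assumed helper
#     val_hist.append(evaluate(model, val_data))  # assumed helper
#     if correlation_driven_stop(train_hist, val_hist):
#         break  # train/val losses have stopped moving together
```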
