Abstract
The class imbalance problem, observed in many real-world applications, degrades the performance of neural networks. Resampling and cost-sensitive learning are the two approaches most commonly applied to address it. In this study, we propose a novel training method for neural networks based on adaptive noise, named ADANOISE, which incorporates the ideas of both resampling and cost-sensitive learning to improve training under class imbalance. Random noise is added to the input whenever the neural network learns from a minority-class instance. To make the learning objective cost-sensitive, each minority-class instance is oversampled by adding different noise vectors randomly sampled from a noise distribution. The neural network and the parameter of the noise distribution are trained simultaneously; in this way, the noise distribution adapts to the training data in a data-driven fashion toward improving the performance of the neural network. We demonstrate the effectiveness of the proposed method through experiments on benchmark datasets.
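To make the described idea concrete, the following is a minimal PyTorch sketch of one plausible realization, not the authors' implementation: a learnable Gaussian noise scale is trained jointly with the classifier, and each minority-class instance is oversampled with several reparameterized noise draws so that gradients also flow into the noise parameter. All names (AdaNoiseClassifier, n_copies, the network architecture) are illustrative assumptions.

```python
# Hypothetical sketch of adaptive-noise training under class imbalance.
# Assumption: the noise distribution is a zero-mean Gaussian with a
# learnable per-feature scale; the paper's actual parameterization may differ.
import torch
import torch.nn as nn

class AdaNoiseClassifier(nn.Module):
    def __init__(self, in_dim, n_classes, minority_class, n_copies=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, n_classes)
        )
        # Learnable parameter of the noise distribution, kept positive
        # by parameterizing it in log-space.
        self.log_sigma = nn.Parameter(torch.zeros(in_dim))
        self.minority_class = minority_class
        self.n_copies = n_copies  # oversampling factor for minority instances

    def forward(self, x, y):
        minority = (y == self.minority_class)
        if self.training and minority.any():
            x_min, y_min = x[minority], y[minority]
            # Reparameterized noise: x_aug depends differentiably on
            # log_sigma, so the noise distribution adapts during training.
            eps = torch.randn(self.n_copies, *x_min.shape, device=x.device)
            x_aug = (x_min.unsqueeze(0) + eps * self.log_sigma.exp()).flatten(0, 1)
            y_aug = y_min.repeat(self.n_copies)
            x = torch.cat([x, x_aug], dim=0)
            y = torch.cat([y, y_aug], dim=0)
        return self.net(x), y

# Usage: one training step on a synthetic imbalanced batch.
model = AdaNoiseClassifier(in_dim=10, n_classes=2, minority_class=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 10)
y = (torch.rand(32) < 0.1).long()  # roughly 10% minority class
logits, targets = model(x, y)
loss = nn.functional.cross_entropy(logits, targets)
opt.zero_grad(); loss.backward(); opt.step()
```

Note how the oversampling itself makes the objective cost-sensitive in the sense described above: each minority instance contributes n_copies additional noisy terms to the loss, so minority errors are weighted more heavily than majority errors.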