Abstract

The performance of deep neural networks is prone to degradation by label noise because of their powerful capacity to fit the training data. Treating low-loss instances as clean data is one of the most promising strategies for tackling label noise and has been widely adopted by state-of-the-art methods. However, prior works tend to discard high-loss instances outright, neglecting the valuable information they carry. To address this issue, we propose an end-to-end framework named Co-LDL, which combines the low-loss sample selection strategy with label distribution learning. Specifically, we simultaneously train two deep neural networks and let them communicate useful knowledge by selecting low-loss and high-loss samples for each other. Low-loss samples are used conventionally to update network parameters. High-loss samples, by contrast, are trained in a label distribution learning manner, updating network parameters and label distributions concurrently. Moreover, we propose a self-supervised module that further boosts model performance by enhancing the learned representations. Comprehensive experiments on both synthetic and real-world noisy datasets demonstrate the superiority of our Co-LDL method over state-of-the-art approaches to learning with noisy labels. The source code and models are available at https://github.com/NUST-Machine-Intelligence-Laboratory/CoLDL.
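To make the training scheme described above concrete, the following is a minimal PyTorch-style sketch of one co-training step. It is an illustration, not the paper's implementation: the names (split_by_loss, co_ldl_step, net_a, net_b, ld, keep_ratio, alpha) are hypothetical, the exponential-moving-average rule for updating label distributions is an assumption, and the self-supervised module is omitted.

    import torch
    import torch.nn.functional as F

    def split_by_loss(logits, labels, keep_ratio):
        """Return indices of low-loss (treated as clean) and high-loss samples."""
        losses = F.cross_entropy(logits, labels, reduction="none")
        order = torch.argsort(losses)                 # ascending loss
        n_keep = int(keep_ratio * len(labels))        # assumes keep_ratio > 0
        return order[:n_keep], order[n_keep:]         # low-loss, high-loss

    def co_ldl_step(net_a, net_b, opt_a, opt_b, x, y, ld, idx, keep_ratio, alpha=0.9):
        """One hedged co-training step; each net learns from its peer's selection.

        ld is a (num_samples, num_classes) table of per-sample label
        distributions for the high-loss branch; idx holds each batch
        sample's row in that table. alpha is an assumed EMA momentum.
        """
        logits_a, logits_b = net_a(x), net_b(x)

        # Each network selects low-/high-loss samples for its peer.
        low_b, high_b = split_by_loss(logits_a.detach(), y, keep_ratio)  # fed to net_b
        low_a, high_a = split_by_loss(logits_b.detach(), y, keep_ratio)  # fed to net_a

        for net, opt, logits, low, high in (
            (net_a, opt_a, logits_a, low_a, high_a),
            (net_b, opt_b, logits_b, low_b, high_b),
        ):
            # Low-loss samples: conventional cross-entropy on the given labels.
            loss = F.cross_entropy(logits[low], y[low])
            # High-loss samples: instead of being discarded, they are fit
            # against their current label distributions via KL divergence.
            if len(high) > 0:
                log_p = F.log_softmax(logits[high], dim=1)
                loss = loss + F.kl_div(log_p, ld[idx[high]], reduction="batchmean")
            opt.zero_grad()
            loss.backward()
            opt.step()

        # Refresh stored label distributions from the two networks' averaged
        # predictions (an assumed EMA rule; the paper's exact update may differ).
        with torch.no_grad():
            p = (F.softmax(logits_a, 1) + F.softmax(logits_b, 1)) / 2
            ld[idx] = alpha * ld[idx] + (1 - alpha) * p
        return loss.item()

The cross-selection (each network's peer decides which samples it sees as clean) mirrors the co-teaching idea of exchanging knowledge between two networks, while the KL term on high-loss samples is what keeps them in the training signal rather than dropping them.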
