27 Oct 2020

Filtering noisy labels is crucial for the robust training of deep neural networks. To train networks under label noise, sampling methods have been introduced that select reliable instances and update the network using only the sampled data. Because they rarely employ the non-sampled data for training, these methods suffer from a fundamental limitation: they reduce the amount of training data. To alleviate this problem, our approach aims to fully utilize the whole dataset by leveraging the information of the sampled data. To this end, we propose a novel graph-based learning framework that enables networks to propagate the label information of the sampled data to adjacent data, whether sampled or not. We also propose a novel self-training strategy that utilizes the non-sampled data without labels and regularizes the network update using the information of the sampled data. Our method outperforms state-of-the-art sampling methods.
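The core idea of spreading the labels of the sampled (reliable) instances to adjacent instances over a graph can be illustrated with generic graph-based label propagation. The sketch below is an assumption-laden stand-in, not the paper's exact formulation: the kNN affinity graph, Gaussian edge weights, and the standard label-spreading iteration F ← αSF + (1−α)Y are all choices made here for illustration.

```python
import numpy as np

def propagate_labels(features, labels, sampled_mask, n_classes,
                     k=3, alpha=0.99, n_iters=50):
    """Spread the labels of sampled (reliable) instances to all instances
    over a kNN similarity graph. Illustrative sketch, not the paper's method."""
    n = features.shape[0]
    # Pairwise squared Euclidean distances between instance features.
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    # Gaussian affinities with a data-driven bandwidth (assumed heuristic).
    sigma2 = d2[d2 > 0].mean() + 1e-12
    W = np.exp(-d2 / sigma2)
    np.fill_diagonal(W, 0.0)
    # Keep only each node's k strongest edges (kNN sparsification).
    for i in range(n):
        weak = np.argsort(W[i])[:-k]
        W[i, weak] = 0.0
    W = np.maximum(W, W.T)  # symmetrise
    # Normalised affinity matrix S = D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(W.sum(1), 1e-12))
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # One-hot seed labels only on the sampled instances;
    # non-sampled rows start with no label information.
    Y = np.zeros((n, n_classes))
    Y[sampled_mask, labels[sampled_mask]] = 1.0
    # Standard label-spreading iteration: F <- alpha*S*F + (1-alpha)*Y.
    F = Y.copy()
    for _ in range(n_iters):
        F = alpha * S @ F + (1 - alpha) * Y
    return F.argmax(1)  # pseudo-label for every instance, sampled or not
```

The returned pseudo-labels cover the whole dataset, so the non-sampled instances can also contribute to training, which is the motivation the abstract describes; the self-training and regularization components of the proposed framework are not reproduced here.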
