
Robust Binary Loss For Multi-Category Classification With Label Noise

Defu Liu, Guowu Yang, Jinzhao Wu, Jiayi Zhao, Fengmao Lv

Length: 00:07:53
09 Jun 2021

Deep learning has achieved tremendous success in image classification. However, this performance relies heavily on large-scale, accurate annotations, which are usually hard to collect in practice. It is therefore essential to explore methods that can train deep models effectively under label noise. To address this problem, we propose to train deep models with robust binary loss functions. Specifically, we tackle the $K$-class classification task with $K$ binary classifiers, which can be trained jointly using multi-category large-margin classification approaches such as Pairwise Comparison (PC) or One-Versus-All (OVA). Our method is robust to label noise when symmetric functions, e.g., the sigmoid loss or the ramp loss, are employed as the binary loss in the risk-minimization framework. Learning theory reveals that our method can be inherently tolerant to label noise in multi-category classification tasks. Extensive experiments on different datasets with different types of label noise clearly confirm the effectiveness of our method.
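To make the decomposition concrete, below is a minimal sketch (not the authors' code) of OVA and PC surrogate losses built on the symmetric sigmoid binary loss, written in PyTorch. The function names `sigmoid_loss`, `ova_loss`, and `pc_loss`, and the exact form of the surrogates, are illustrative assumptions; the key property used is that the sigmoid loss satisfies ℓ(z) + ℓ(−z) = 1.

```python
import torch
import torch.nn.functional as F


def sigmoid_loss(z):
    """Symmetric binary margin loss: sigmoid_loss(z) + sigmoid_loss(-z) == 1."""
    return torch.sigmoid(-z)


def ova_loss(scores, targets, binary_loss=sigmoid_loss):
    """One-Versus-All surrogate (illustrative form).

    scores:  (batch, K) real-valued outputs of the K binary classifiers.
    targets: (batch,) integer labels in [0, K).
    """
    one_hot = F.one_hot(targets, scores.size(1)).to(scores.dtype)
    signs = 2.0 * one_hot - 1.0  # +1 for the labelled class, -1 for the rest
    return binary_loss(signs * scores).sum(dim=1).mean()


def pc_loss(scores, targets, binary_loss=sigmoid_loss):
    """Pairwise-Comparison surrogate (illustrative form): penalise the margin
    between the labelled class score and every other class score."""
    true_scores = scores.gather(1, targets.unsqueeze(1))  # (batch, 1)
    losses = binary_loss(true_scores - scores)            # (batch, K)
    # Mask out the k == y term, whose margin is zero by construction.
    mask = torch.ones_like(losses).scatter_(1, targets.unsqueeze(1), 0.0)
    return (losses * mask).sum(dim=1).mean()


if __name__ == "__main__":
    scores = torch.randn(8, 10, requires_grad=True)  # e.g. a 10-class problem
    targets = torch.randint(0, 10, (8,))
    loss = ova_loss(scores, targets)  # used in place of cross-entropy
    loss.backward()
```

Either surrogate can replace the usual cross-entropy term during training; swapping `sigmoid_loss` for a non-symmetric loss such as the logistic loss removes the noise-tolerance property the paper relies on.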

Chairs:
Simone Milani
