
LABEL-GUIDED CONTRASTIVE LEARNING FOR OUT-OF-DOMAIN DETECTION

Shun Zhang (Beihang University); Tongliang Li (Beihang University); Jiaqi Bai (Beihang University); Zhoujun Li (Beihang University)

07 Jun 2023

Out-of-Domain (OOD) detection from user utterances plays an important role in task-oriented dialogue systems. Recent studies utilize supervised or self-supervised contrastive learning (CL) to learn discriminative semantic features for OOD detection. Supervised contrastive learning (SCL) methods model only class-level features of different in-domain (IND) intents, while self-supervised CL (SSCL) methods can model instance-level features. However, SSCL methods require complex data augmentation and are vulnerable to intrinsic false-negative pairs. To address these issues and leverage the strengths of both types of CL, we propose a novel Label-Guided Contrastive Learning (LGCL) framework. LGCL models both instance-level and class-level discriminative semantic representations by employing each sample and its corresponding label simultaneously, so that prior knowledge of IND intents can be fully leveraged. Experimental results and analysis on two public benchmark datasets show that the proposed method significantly outperforms baselines on the OOD detection task.
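To illustrate the general idea of combining class-level and instance-to-label signals, the sketch below implements a generic label-guided contrastive loss. This is a minimal illustration, not the paper's exact formulation: it assumes a standard supervised contrastive setup in which each utterance embedding is pulled toward other samples of the same intent and toward a learned embedding of its own intent label. All names (`label_guided_contrastive_loss`, `label_embs`, `temperature`) are hypothetical.

```python
# Hypothetical sketch of a label-guided contrastive loss (not the paper's exact method).
# Each in-domain utterance embedding is pulled toward (a) other samples with the same
# intent (class-level signal) and (b) a learned embedding of its intent label.
import torch
import torch.nn.functional as F

def label_guided_contrastive_loss(features, label_embs, labels, temperature=0.1):
    """features: (B, d) utterance embeddings.
    label_embs: (C, d) learned intent-label embeddings.
    labels: (B,) intent ids in [0, C)."""
    z = F.normalize(features, dim=-1)             # (B, d)
    l = F.normalize(label_embs, dim=-1)           # (C, d)
    B, C = z.size(0), l.size(0)

    # Candidates for each anchor: all samples plus all label embeddings.
    candidates = torch.cat([z, l], dim=0)         # (B + C, d)
    logits = z @ candidates.t() / temperature     # (B, B + C)

    # Mask out self-similarity (sample i compared with itself).
    self_mask = torch.zeros_like(logits, dtype=torch.bool)
    self_mask[:, :B] = torch.eye(B, dtype=torch.bool, device=z.device)
    logits = logits.masked_fill(self_mask, float('-inf'))

    # Positives: same-intent samples (excluding self) plus the anchor's own label embedding.
    sample_pos = labels.unsqueeze(1) == labels.unsqueeze(0)         # (B, B)
    sample_pos &= ~torch.eye(B, dtype=torch.bool, device=z.device)
    label_pos = F.one_hot(labels, num_classes=C).bool()             # (B, C)
    pos_mask = torch.cat([sample_pos, label_pos], dim=1)            # (B, B + C)

    # InfoNCE-style loss averaged over positives; every anchor has at least its label.
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_mask.sum(dim=1)
    return loss.mean()
```

In this sketch the label embeddings provide the prior knowledge of IND intents: even a singleton sample in a batch has a positive (its label), which is one way such a loss can combine instance-level and class-level discrimination without the data augmentation that SSCL relies on.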
