CONTRASTIVE LEARNING WITH DIALOGUE ATTRIBUTES FOR NEURAL DIALOGUE GENERATION

Jie Tan (The Chinese University of Hong Kong); Hengyi Cai (Baidu Inc.); Hongshen Chen (JD.com); Hong Cheng (Chinese University of Hong Kong); Helen Meng (The Chinese University of Hong Kong); Zhuoye Ding (JD.com)

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
06 Jun 2023

Designing an effective learning method remains a challenge in neural dialogue generation, because the training objective must closely approximate the dialogue properties that humans intrinsically prefer. Conventional training approaches such as maximum likelihood estimation focus on modeling general syntactic patterns and may fail to capture intricate conversational characteristics. Contrastive dialogue learning offers an effective training schema by explicitly training a neural dialogue model on multiple positive and negative conversational pairs. However, constructing contrastive learning pairs is non-trivial, and multiple dialogue attributes have been found to govern human judgments of conversation quality. This paper proposes to guide response generation with attribute-aware contrastive learning to improve the overall quality of the generated responses, where contrastive learning samples are generated according to various important dialogue attributes, each specializing in a different principle of conversation. Extensive experiments show that the proposed techniques are crucial to achieving superior model performance.
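To make the contrastive schema concrete, the sketch below shows a generic InfoNCE-style loss over one positive response and several attribute-specific negatives (e.g. low-coherence or generic responses). This is an illustrative sketch only; the abstract does not give the paper's exact objective, and all function names and the temperature value here are assumptions.

```python
import numpy as np

def contrastive_loss(context, positive, negatives, temperature=0.1):
    """Illustrative InfoNCE-style contrastive loss.

    Pulls the context embedding toward the positive response embedding
    and pushes it away from negatives, which in an attribute-aware setup
    would be sampled per dialogue attribute (coherence, specificity, ...).
    NOT the paper's exact objective; a generic sketch.
    """
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Similarity logits: positive pair first, then the negatives.
    logits = np.array([cos(context, positive)] +
                      [cos(context, n) for n in negatives]) / temperature

    # Softmax cross-entropy with the positive at index 0.
    logits -= logits.max()
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])

# Toy usage: a response aligned with the context vs. two mismatched ones.
rng = np.random.default_rng(0)
ctx = rng.normal(size=8)
pos = ctx + 0.1 * rng.normal(size=8)           # human-preferred response
negs = [rng.normal(size=8) for _ in range(2)]  # attribute-violating samples
loss = contrastive_loss(ctx, pos, negs)
```

Minimizing this loss drives the model to score attribute-satisfying responses above attribute-violating ones; in the attribute-aware setting, one such term could be computed per dialogue attribute and the terms combined.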
