  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:13:12
10 Jun 2021

The conditional variational autoencoder (cVAE) has shown promising performance in dialogue generation. However, two issues remain in the dialogue cVAE model. The first is the Kullback-Leibler (KL) vanishing problem, which degenerates the cVAE into a simple recurrent neural network. The second is the assumption of an isotropic Gaussian prior for the latent variable, which is too simple to assure diversity in the generated responses. To handle these issues, a simple distribution should be transformed into a complex distribution while the value of the KL divergence is preserved. This paper presents the dialogue flow VAE (DF-VAE) for variational dialogue generation. In particular, KL vanishing is tackled by a new normalizing flow, and an inverse autoregressive flow is proposed to transform the isotropic Gaussian prior into a rich distribution. In the experiments, the proposed DF-VAE is significantly better than the other methods in terms of different evaluation metrics, and the diversity of the generated dialogue responses is enhanced. An ablation study illustrates the merit of the proposed flow models.
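The abstract does not spell out the DF-VAE flow itself, but the core idea it relies on, transforming an isotropic Gaussian sample through an inverse autoregressive flow (IAF) while tracking the change in log-density via a triangular Jacobian, can be sketched generically. The following is a minimal NumPy illustration of one IAF step in the style of Kingma et al. (2016); the weight matrices, masking scheme, and function names are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4

# Strictly lower-triangular masks enforce the autoregressive property:
# the shift m_i and scale s_i for dimension i depend only on z_{<i}.
mask = np.tril(np.ones((dim, dim)), k=-1)
W_m = rng.normal(size=(dim, dim)) * mask  # illustrative shift weights
W_s = rng.normal(size=(dim, dim)) * mask  # illustrative scale weights

def iaf_step(z):
    """One IAF update: z' = sigma * z + (1 - sigma) * m, sigma = sigmoid(s).

    Because m and s are autoregressive in z, the Jacobian dz'/dz is
    lower triangular with diagonal entries sigma_i, so its
    log-determinant is simply sum_i log(sigma_i).
    """
    m = W_m @ z
    s = W_s @ z
    sigma = 1.0 / (1.0 + np.exp(-s))
    z_new = sigma * z + (1.0 - sigma) * m
    log_det = np.log(sigma).sum()
    return z_new, log_det

# Draw from the simple isotropic Gaussian prior and transform it.
z0 = rng.normal(size=dim)
z1, log_det = iaf_step(z0)

# Density under the transformed (richer) distribution by change of variables:
# log q(z1) = log N(z0; 0, I) - log|det dz'/dz|.
log_q = -0.5 * (z0 @ z0 + dim * np.log(2.0 * np.pi)) - log_det
```

Stacking several such steps (permuting dimensions in between) is what turns the simple prior into a flexible distribution while keeping the log-density, and hence the KL term, exactly computable.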

Chairs:
Eric Fosler-Lussier

