  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:09:20
08 Jun 2021

Open-domain multi-turn conversations mainly have three features: hierarchical semantic structure, redundant information, and long-term dependency. Grounded on these, selecting relevant context becomes a challenging step for multi-turn dialogue generation. However, existing methods cannot differentiate both useful words and utterances at long distances from a response. Besides, previous work performs context selection based only on a state in the decoder, which lacks global guidance and can lead the model to focus on irrelevant or unnecessary information. In this paper, we propose a novel model with a hierarchical self-attention mechanism and distant supervision that not only detects relevant words and utterances at both short and long distances, but also discerns related information globally when decoding. Experimental results on two public datasets, under both automatic and human evaluations, show that our model significantly outperforms other baselines in terms of fluency, coherence, and informativeness.
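The hierarchical selection idea in the abstract can be illustrated with a minimal sketch: attention is applied first over the words within each utterance, then over the pooled utterance vectors, so both word-level and utterance-level relevance to a query are scored. This is a simplified, hypothetical illustration (plain dot-product attention over pre-computed embeddings), not the paper's actual architecture; all function names here are invented for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(vectors, query):
    """Score each vector against the query and return the
    attention-weighted sum plus the weights themselves."""
    scores = vectors @ query          # (n,)
    weights = softmax(scores)         # (n,) sums to 1
    return weights @ vectors, weights # pooled (d,), weights (n,)

def hierarchical_context(utterances, query):
    """Word-level attention inside each utterance, then
    utterance-level attention over the pooled utterance vectors."""
    pooled = np.stack([attention_pool(u, query)[0] for u in utterances])
    context, utt_weights = attention_pool(pooled, query)
    return context, utt_weights

# Example: two utterances of 5 and 3 word embeddings (dim 8)
utterances = [np.random.randn(5, 8), np.random.randn(3, 8)]
query = np.random.randn(8)
context, utt_weights = hierarchical_context(utterances, query)
```

The utterance-level weights give a global, per-utterance relevance signal, which is the kind of quantity the paper's distant supervision would guide; in the real model the query would come from the decoder state and the attention would be learned self-attention rather than fixed dot products.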

Chairs:
Yang Liu
