Learning To Select Context In A Hierarchical And Global Perspective For Open-Domain Dialogue Generation
Lei Shen, Haolan Zhan, Xin Shen, Yang Feng
Open-domain multi-turn conversations mainly have three features: hierarchical semantic structure, redundant information, and long-term dependency. Grounded on these, selecting relevant context becomes a challenging step for multi-turn dialogue generation. However, existing methods cannot differentiate useful words and utterances that are distant from a response. Besides, previous work performs context selection based only on a state in the decoder, which lacks global guidance and may lead the model to focus on irrelevant or unnecessary information. In this paper, we propose a novel model with a hierarchical self-attention mechanism and distant supervision to not only detect relevant words and utterances at both short and long distances, but also discern related information globally when decoding. Experimental results of both automatic and human evaluations on two public datasets show that our model significantly outperforms other baselines in terms of fluency, coherence, and informativeness.
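To make the hierarchical selection idea concrete, below is a minimal sketch (not the authors' implementation) of word-level followed by utterance-level self-attention over a dialogue history, written in PyTorch; the class name, pooling choice, and dimensions are illustrative assumptions, and the distant-supervision signal is omitted.

# Minimal sketch of hierarchical self-attention for context selection.
# Assumption: PyTorch, toy dimensions; not the paper's actual code.
import torch
import torch.nn as nn

class HierarchicalSelfAttention(nn.Module):
    def __init__(self, d_model=128, n_heads=4):
        super().__init__()
        # Word-level self-attention inside each utterance.
        self.word_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Utterance-level self-attention across the dialogue history.
        self.utt_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, word_embs):
        # word_embs: (batch, n_utterances, n_words, d_model)
        b, u, w, d = word_embs.shape
        words = word_embs.view(b * u, w, d)
        # Weigh useful tokens within each utterance.
        word_ctx, _ = self.word_attn(words, words, words)
        # Mean-pool attended words into one vector per utterance.
        utt_reprs = word_ctx.mean(dim=1).view(b, u, d)
        # Weigh relevant utterances, including distant ones.
        utt_ctx, utt_weights = self.utt_attn(utt_reprs, utt_reprs, utt_reprs)
        return utt_ctx, utt_weights  # context states + utterance relevance weights

# Toy usage: 2 dialogues, 5 utterances of 10 words each.
model = HierarchicalSelfAttention()
x = torch.randn(2, 5, 10, 128)
ctx, weights = model(x)

The two attention levels mirror the hierarchical structure described in the abstract: the word level filters redundant tokens inside a turn, while the utterance level captures long-term dependencies across turns.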
Chairs:
Yang Liu