HKA: A Hierarchical Knowledge Attention Mechanism For Multi-Turn Dialogue System
Jian Song, Kailai Zhang, Xuesi Zhou, Ji Wu
Generating informative responses by incorporating external knowledge into dialogue systems is attracting increasing attention. Most previous works generate such responses in single-turn dialogue systems. However, few works incorporate knowledge into multi-turn systems, because the hierarchy of knowledge, spanning the words and utterances in the dialogue context, is ignored. Motivated by this, we propose a novel hierarchical knowledge attention (HKA) mechanism for open-domain multi-turn dialogue systems, which jointly utilizes word-level and utterance-level attention. Experiments demonstrate that the proposed HKA can incorporate more appropriate knowledge and help state-of-the-art models generate more informative responses. Further analysis shows that HKA improves the model's dialogue state management, especially when the number of dialogue turns is large.
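The abstract does not spell out the HKA formulation, but the general idea of combining word-level and utterance-level attention can be illustrated with a generic two-level attention sketch. The PyTorch module below is an illustration under stated assumptions, not the authors' actual HKA mechanism: the additive scoring functions, the layer names `word_score`/`utt_score`, and the tensor shapes are all invented for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoLevelAttention(nn.Module):
    """Illustrative two-level (word -> utterance) attention over a dialogue context.

    A minimal sketch of hierarchical attention in general, assuming additive
    scoring at both levels; it is not the paper's exact HKA mechanism.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        # Scoring layers for the two attention levels (assumed design choice).
        self.word_score = nn.Linear(hidden_size, 1)
        self.utt_score = nn.Linear(hidden_size, 1)

    def forward(self, word_states: torch.Tensor) -> torch.Tensor:
        # word_states: (num_utterances, num_words, hidden_size)

        # 1) Word-level attention: weight the words inside each utterance
        #    and pool them into one summary vector per utterance.
        word_logits = self.word_score(torch.tanh(word_states))   # (U, W, 1)
        word_weights = F.softmax(word_logits, dim=1)              # softmax over words
        utt_states = (word_weights * word_states).sum(dim=1)      # (U, H)

        # 2) Utterance-level attention: weight the utterance summaries
        #    and pool them into a single context vector.
        utt_logits = self.utt_score(torch.tanh(utt_states))       # (U, 1)
        utt_weights = F.softmax(utt_logits, dim=0)                 # softmax over utterances
        context = (utt_weights * utt_states).sum(dim=0)            # (H,)
        return context


# Usage with random inputs: 4 utterances, 10 words each, hidden size 64.
attn = TwoLevelAttention(hidden_size=64)
states = torch.randn(4, 10, 64)
print(attn(states).shape)  # torch.Size([64])
```

In this sketch the word-level weights decide which words matter within each utterance, and the utterance-level weights decide which utterances matter within the dialogue; a knowledge-grounded variant would presumably use such context-aware weights when attending over external knowledge, but how HKA does so is detailed only in the full paper.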