04 May 2020

Generating informative responses by incorporating external knowledge into dialogue systems has attracted increasing attention. Most previous work incorporates such knowledge only into single-turn dialogue systems; few studies address knowledge incorporation in the multi-turn setting, where the hierarchical structure of the context, words grouped into utterances, is typically ignored. Motivated by this, we propose a novel hierarchical knowledge attention (HKA) mechanism for open-domain multi-turn dialogue systems, which jointly applies word-level and utterance-level attention. Experiments demonstrate that the proposed HKA incorporates more appropriate knowledge and enables state-of-the-art models to generate more informative responses. Further analysis shows that HKA improves the model's dialogue state management, especially when the number of dialogue turns is large.
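The abstract does not spell out the mechanism, but a two-level attention of this kind can be sketched as follows. This is a minimal, hypothetical illustration assuming simple dot-product attention: word-level queries first attend over knowledge entries, then an utterance-level query attends over the resulting word-level summaries to produce a single knowledge vector. All function names, shapes, and the use of dot-product scoring are illustrative assumptions, not the paper's actual formulation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attend(query, keys, values):
    # Dot-product attention: weight each value by the similarity
    # of the query to the corresponding key.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

def hierarchical_knowledge_attention(knowledge, word_queries, utterance_query):
    # Word level: each word representation in the context attends
    # over the external knowledge entries (hypothetical setup).
    word_summaries = [attend(q, knowledge, knowledge) for q in word_queries]
    # Utterance level: the utterance representation attends over the
    # word-level summaries, yielding one knowledge vector to condition
    # response generation on.
    return attend(utterance_query, word_summaries, word_summaries)

# Toy example: two 2-d knowledge vectors, two context words, one utterance query.
knowledge = [[1.0, 0.0], [0.0, 1.0]]
word_queries = [[1.0, 0.0], [0.0, 1.0]]
utterance_query = [1.0, 0.0]
fused = hierarchical_knowledge_attention(knowledge, word_queries, utterance_query)
```

Because every level outputs a convex combination of its inputs, the fused vector here stays a convex combination of the knowledge entries; the hierarchy only changes how the mixing weights are computed, word context first, then utterance context.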
