PRECOGNITION IN CONTEXTUAL SPOKEN LANGUAGE UNDERSTANDING VIA KNOWLEDGE DISTILLATION
Nan Su (Ant Group); Bingzhu Du (Ant Group); Yuchi Zhang (Ant Financial Services Group); Chao Liu (Ant Group); Yongliang Wang (Ant Group); Hong Chen (Ant Group); Xin Lu (Ant Group)
Task-oriented dialogue systems have become overwhelmingly popular in recent research. Spoken Language Understanding (SLU) is widely used to extract the semantic frame of user queries and to comprehend users’ intent/emotion/dialogue state in task-oriented dialogue systems. Most previous work on such discriminative tasks models only the current query or the historical conversation. Even when the entire conversation flow is modeled, the resulting models are not suitable for real-world task-oriented dialogue systems, where future contexts are not visible until a response has been given based on the current dialogue state. In this paper, we propose to jointly model historical and future information using knowledge distillation, addressing the discrepancy between the information available offline and online in dialogue understanding.
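The abstract describes a teacher-student setup: an offline teacher that sees both past and future turns distills its predictions into an online student that sees only the history available at inference time. The sketch below illustrates only the generic knowledge-distillation objective such a setup would use; the loss form, temperature, and weighting are common defaults and assumptions on our part, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Generic KD objective (assumed, not the paper's exact loss):
    a weighted sum of hard-label cross-entropy and the KL divergence
    between the student's and the frozen teacher's softened outputs.

    student_logits: predictions from the online model (history only).
    teacher_logits: predictions from the offline model (history + future).
    labels: gold intent/emotion/dialogue-state annotations.
    """
    # Softened distributions; higher temperature exposes more of the
    # teacher's "dark knowledge" about non-target classes.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(log_student, soft_teacher, reduction="batchmean")
    kd = kd * temperature ** 2  # rescale gradients for the temperature
    # Supervised loss on the gold labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

At deployment only the student is run, so the future context used to train the teacher is never required online, which is the discrepancy the abstract says the method addresses.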