A Co-Interactive Transformer For Joint Slot Filling And Intent Detection

Libo Qin, Tailu Liu, Wanxiang Che, Bingbing Kang, Sendong Zhao, Ting Liu

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:07:13
09 Jun 2021

Intent detection and slot filling are the two main tasks for building a spoken language understanding (SLU) system. The two tasks are closely related, and the information of one task can benefit the other. Previous studies either implicitly model the two tasks with a multi-task framework or only explicitly consider the single information flow from intent to slot. None of the prior approaches model the bidirectional connection between the two tasks simultaneously in a unified framework. In this paper, we propose a Co-Interactive Transformer that considers the cross-impact between the two tasks. Instead of adopting the self-attention mechanism of the vanilla Transformer, we propose a co-interactive module that captures the cross-impact by building a bidirectional connection between the two related tasks, where slot and intent representations can attend to the corresponding mutual information. Experimental results on two public datasets show that our model achieves state-of-the-art performance.
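The bidirectional connection described above can be illustrated with a minimal sketch: slot hidden states attend to the intent representation while the intent representation attends to the slot states, so each task's features are updated with mutual information. This is a simplified illustration under assumed shapes and a single unprojected attention head, not the authors' implementation (the paper's module also includes learned projections and other Transformer components).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values):
    # Scaled dot-product attention: each query row attends to keys_values.
    d = queries.shape[-1]
    scores = queries @ keys_values.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ keys_values

def co_interactive_layer(h_slot, h_intent):
    # Bidirectional connection: slot attends to intent, intent attends to slot,
    # each with a residual connection (hypothetical simplification).
    slot_updated = h_slot + cross_attention(h_slot, h_intent)
    intent_updated = h_intent + cross_attention(h_intent, h_slot)
    return slot_updated, intent_updated

rng = np.random.default_rng(0)
h_slot = rng.standard_normal((5, 8))    # 5 tokens, hidden size 8 (assumed)
h_intent = rng.standard_normal((1, 8))  # single utterance-level intent vector
s, i = co_interactive_layer(h_slot, h_intent)
print(s.shape, i.shape)  # shapes are preserved: (5, 8) (1, 8)
```

Because the update is symmetric, information flows in both directions in one layer, unlike pipelines that only feed the predicted intent into the slot decoder.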

Chairs:
Sicheng Zhao
