  • SPS Members: Free
  • IEEE Members: $11.00
  • Non-members: $15.00
  • Length: 00:10:20
09 Jun 2021

Spoken Language Understanding (SLU) is an essential component of spoken dialogue systems and typically consists of intent detection (ID) and slot filling (SF) tasks. During a conversation, most user utterances carry rich sentiment information that is helpful for the ID and SF tasks but has been left unexplored by existing work. In this paper, we argue that implicitly introducing sentiment features can improve SLU performance. Specifically, we present a Multi-task Learning (MTL) framework that implicitly extracts and utilizes aspect-based sentiment text features. In addition, we introduce an Iteratively Co-Interactive Network (ICN) for the SLU task to fully exploit the comprehensive text features. Experimental results show that, with external BERT representations, our framework achieves a new state of the art on two benchmark datasets, SNIPS and ATIS.
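
To make the multi-task setup concrete, the sketch below shows one plausible way to train intent detection and slot filling alongside an auxiliary sentiment objective over a shared encoder. It is a minimal illustration only: the abstract does not specify the internals of the ICN or the loss weighting, so the BiLSTM encoder, the pooling choice, the `sent_weight` value, and all class and module names here are assumptions, not the authors' architecture (a BERT encoder could replace the embedding/LSTM pair).

    import torch
    import torch.nn as nn

    class MultiTaskSLU(nn.Module):
        """Shared encoder with intent, slot, and auxiliary sentiment heads.

        Hypothetical stand-in for an MTL SLU model; not the paper's ICN.
        """

        def __init__(self, vocab_size, hidden_dim, num_intents, num_slot_tags, num_sentiments):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, hidden_dim, padding_idx=0)
            self.encoder = nn.LSTM(hidden_dim, hidden_dim, batch_first=True, bidirectional=True)
            self.intent_head = nn.Linear(2 * hidden_dim, num_intents)        # utterance-level
            self.slot_head = nn.Linear(2 * hidden_dim, num_slot_tags)        # token-level
            self.sentiment_head = nn.Linear(2 * hidden_dim, num_sentiments)  # auxiliary task

        def forward(self, token_ids):
            emb = self.embedding(token_ids)    # (B, T, H)
            hidden, _ = self.encoder(emb)      # (B, T, 2H)
            utt_repr = hidden.mean(dim=1)      # simple mean pooling for utterance-level heads
            return (self.intent_head(utt_repr),      # intent logits    (B, num_intents)
                    self.slot_head(hidden),          # slot logits      (B, T, num_slot_tags)
                    self.sentiment_head(utt_repr))   # sentiment logits (B, num_sentiments)

    def joint_loss(intent_logits, slot_logits, sent_logits,
                   intent_gold, slot_gold, sent_gold, sent_weight=0.3):
        """Weighted sum of the three task losses; sent_weight is an assumed hyperparameter."""
        ce = nn.CrossEntropyLoss(ignore_index=-100)
        loss_id = ce(intent_logits, intent_gold)
        loss_sf = ce(slot_logits.reshape(-1, slot_logits.size(-1)), slot_gold.reshape(-1))
        loss_sent = ce(sent_logits, sent_gold)
        return loss_id + loss_sf + sent_weight * loss_sent

In this kind of setup the sentiment head is used only during training, so sentiment information influences the shared representation implicitly and no sentiment labels are needed at inference time.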

Chairs:
Jasha Droppo

