10 May 2022

As an essential task for natural language understanding, slot filling aims to identify the contiguous spans of specific slots in an utterance. In real-world applications, labeling utterances can be expensive, and transfer learning techniques have been developed to ease this problem. However, cross-domain slot filling can suffer significantly from negative transfer caused by non-targeted or zero-shot slots. This paper explores several ways to measure transferability across slot filling domains and finds that the number of shared slots can serve as an efficient and effective estimator. First, this frustratingly easy measure requires no training data and is cheap to compute. Second, it guides us to heuristically select source domains that share more slots with the target domain, which obtains state-of-the-art results on the Snips benchmark. Third, a dynamic transfer procedure based on this estimator clearly exposes negative transfer in cross-domain slot filling. Finally, we explore a source-free setting in which only black-box source models are available, and propose to weight source domains based on prediction entropy.
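To make the two ideas in the abstract concrete, below is a minimal Python sketch of (a) the shared-slot-count estimator used to rank candidate source domains and (b) an entropy-based weighting of black-box source models. The slot inventories, the `predict` interface, and the softmax-over-negative-entropy weighting are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
# Illustrative sketch only: shared-slot-count transferability estimate and
# entropy-based weighting of black-box source models. Interfaces and the
# exact weighting scheme are assumptions, not the paper's released code.
import math
from typing import Callable, Dict, List, Sequence, Set


def shared_slot_count(source_slots: Set[str], target_slots: Set[str]) -> int:
    """Number of slot types the source domain shares with the target domain."""
    return len(source_slots & target_slots)


def rank_source_domains(domain_slots: Dict[str, Set[str]], target: str) -> List[str]:
    """Rank candidate source domains by shared slot count with the target (descending)."""
    target_slots = domain_slots[target]
    sources = [d for d in domain_slots if d != target]
    return sorted(
        sources,
        key=lambda d: shared_slot_count(domain_slots[d], target_slots),
        reverse=True,
    )


def entropy(probs: Sequence[float]) -> float:
    """Shannon entropy of a predictive distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)


def entropy_weights(
    source_predict_fns: Dict[str, Callable[[str], List[Sequence[float]]]],
    target_utterances: List[str],
) -> Dict[str, float]:
    """Weight black-box source models for the source-free setting: a model whose
    per-token predictive distributions on target utterances have lower average
    entropy (i.e. it is more confident) receives a higher weight.
    Here the weights are a softmax over negative mean entropy (an assumption)."""
    mean_ent = {}
    for name, predict in source_predict_fns.items():
        ents = [
            entropy(token_probs)
            for utt in target_utterances
            for token_probs in predict(utt)  # per-token label distributions
        ]
        mean_ent[name] = sum(ents) / max(len(ents), 1)
    scores = {name: math.exp(-e) for name, e in mean_ent.items()}
    total = sum(scores.values())
    return {name: s / total for name, s in scores.items()}


if __name__ == "__main__":
    # Toy slot inventories (illustrative only).
    domain_slots = {
        "BookRestaurant": {"city", "restaurant_name", "party_size", "timeRange"},
        "GetWeather": {"city", "timeRange", "condition_description"},
        "PlayMusic": {"artist", "album", "playlist"},
    }
    print(rank_source_domains(domain_slots, target="GetWeather"))
    # -> ['BookRestaurant', 'PlayMusic']  (BookRestaurant shares more slots)
```

The shared-slot ranking requires only the slot inventories of each domain, which is why the measure needs no training data; the entropy weighting only needs the source models' output distributions, which is why it fits the black-box, source-free setting.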