
Meta Learning for Domain Agnostic Soft Prompt

Ming-Yen Chen (National Yang Ming Chiao Tung University); Mahdin Rohmatillah (National Yang Ming Chiao Tung University); Ching-hsien Lee (Industrial Technology Research Institute); Jen-Tzung Chien (National Yang Ming Chiao Tung University)

07 Jun 2023

Prompt-based learning, as popularized by GPT-3, has become a common approach to extracting knowledge from a powerful pre-trained language model (PLM) for natural language understanding tasks. However, neither applying hard prompts built from a collection of human-engineered prompt templates nor directly optimizing soft, continuous prompts with labeled data generalizes well to data from unseen domains. To cope with this issue, this paper presents a new prompt-based unsupervised domain adaptation method in which the learned soft prompt enables the frozen pre-trained language model to handle input tokens from unseen domains. Importantly, meta learning and optimization are developed to obtain the domain-agnostic soft prompt by minimizing the masked language model loss. Experiments on multi-domain natural language understanding tasks show the merits of the proposed method.
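
The sketch below illustrates the general idea described in the abstract, not the authors' implementation: a trainable soft prompt is prepended to the input embeddings of a frozen masked language model, and the prompt is meta-learned across several source domains by minimizing the MLM loss. The BERT backbone, the Reptile-style outer update, the toy two-domain data, and names such as PROMPT_LEN, mlm_loss, and make_mlm_batch are illustrative assumptions; the paper's exact meta-learning algorithm and hyperparameters may differ.

```python
# Minimal sketch: meta-learning a domain-agnostic soft prompt for a frozen PLM.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL_NAME = "bert-base-uncased"   # assumed backbone; the paper's PLM may differ
PROMPT_LEN = 20                    # assumed number of soft prompt tokens

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
plm = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)
plm.requires_grad_(False)          # the pre-trained language model stays frozen

embed_dim = plm.get_input_embeddings().embedding_dim
soft_prompt = torch.nn.Parameter(torch.randn(PROMPT_LEN, embed_dim) * 0.02)

def mlm_loss(prompt, input_ids, labels):
    """Prepend the soft prompt to token embeddings and compute the MLM loss."""
    tok_embeds = plm.get_input_embeddings()(input_ids)             # (B, T, D)
    batch = tok_embeds.size(0)
    prompt_embeds = prompt.unsqueeze(0).expand(batch, -1, -1)      # (B, P, D)
    inputs_embeds = torch.cat([prompt_embeds, tok_embeds], dim=1)  # (B, P+T, D)
    # Prompt positions carry no MLM targets (-100 is ignored by the loss).
    pad = torch.full((batch, PROMPT_LEN), -100, dtype=labels.dtype)
    out = plm(inputs_embeds=inputs_embeds, labels=torch.cat([pad, labels], dim=1))
    return out.loss

# Toy "domains": tiny corpora, tokenized and randomly masked for MLM.
def make_mlm_batch(sentences):
    enc = tokenizer(sentences, return_tensors="pt", padding=True)
    input_ids, labels = enc.input_ids.clone(), enc.input_ids.clone()
    maskable = (torch.rand(input_ids.shape) < 0.15) & (input_ids != tokenizer.pad_token_id)
    maskable[:, 1] = True                      # guarantee at least one masked token
    input_ids[maskable] = tokenizer.mask_token_id
    labels[~maskable] = -100                   # only masked positions contribute
    return input_ids, labels

domains = [make_mlm_batch(["the movie was great", "a boring film"]),
           make_mlm_batch(["battery life is poor", "shipping was fast"])]

# Reptile-style meta update over source domains (one possible instantiation of
# "meta learning and optimization" for the soft prompt).
inner_lr, outer_lr, inner_steps = 1e-3, 1e-2, 3
for _ in range(10):                                                # meta iterations
    meta_grad = torch.zeros_like(soft_prompt)
    for input_ids, labels in domains:
        fast = soft_prompt.detach().clone().requires_grad_(True)
        for _ in range(inner_steps):                               # adapt to one domain
            grad, = torch.autograd.grad(mlm_loss(fast, input_ids, labels), fast)
            fast = (fast - inner_lr * grad).detach().requires_grad_(True)
        meta_grad += soft_prompt.detach() - fast.detach()          # Reptile direction
    with torch.no_grad():
        soft_prompt -= outer_lr * meta_grad / len(domains)
```

Because only the soft prompt is updated while all PLM weights stay frozen, the per-domain inner loop is cheap, and the outer update pushes the prompt toward a point that adapts quickly to any source domain, which is the intuition behind using it on unseen domains.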
