  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 07:21
07 Jul 2020

Few-shot classification aims to learn a model that generalizes well to new classes, unseen during training, from a small number of labeled instances. Many existing approaches learn a shared embedding function across tasks to measure the similarities between support (train) and query (test) samples. However, the embeddings produced by these approaches neglect both the feature importance of individual instances and the feature correlation between support and query samples within each task. To tackle this problem, we propose a novel Self-Adaptive Embedding approach (SAE) based on a hierarchical attention scheme. The novelty of SAE is twofold. First, SAE effectively captures the most discriminative features at the instance level, which significantly improves performance on downstream classification tasks. Second, SAE adaptively adjusts the representations of support and query samples by considering the feature structures they share at the task level. Experiments demonstrate that SAE significantly outperforms existing state-of-the-art methods.
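The two attention levels described above can be illustrated with a minimal sketch. This is not the paper's exact formulation; the scoring vector `w`, the elementwise feature scoring, and the correlation-based task weighting are simplified assumptions chosen to show the idea of instance-level and task-level feature reweighting:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def instance_attention(embeddings, w):
    """Instance-level attention (sketch): each instance reweights its own
    feature dimensions, emphasizing its most discriminative features.
    `w` stands in for a learned scoring vector (an assumption here)."""
    scores = embeddings * w                     # hypothetical per-feature scores
    alpha = softmax(scores, axis=-1)            # attention over feature dims
    d = embeddings.shape[-1]
    return embeddings * alpha * d               # rescale so weights average to 1

def task_attention(support, query):
    """Task-level adaptation (sketch): reweight features by structure shared
    between the support and query sets, here approximated by the product of
    their mean activations."""
    shared = support.mean(axis=0) * query.mean(axis=0)
    beta = softmax(shared)                      # task-level feature weights
    d = support.shape[-1]
    return support * beta * d, query * beta * d

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    support = rng.normal(size=(5, 4))           # 5 support samples, 4-dim features
    query = rng.normal(size=(2, 4))             # 2 query samples
    w = rng.normal(size=4)
    inst = instance_attention(support, w)
    s_adapted, q_adapted = task_attention(support, query)
    print(inst.shape, s_adapted.shape, q_adapted.shape)
```

In a full model, both attention stages would be parameterized and trained episodically, and classification would compare the adapted query embeddings against per-class support prototypes.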
