Self-Adaptive Embedding for Few-shot Classification by Hierarchical Attention
Xueliang Wang, Feng Wu, Jie Wang
SPS
Few-shot classification aims to learn a model that generalizes well to new classes, unseen during training, from only a small number of labeled instances. Many existing approaches learn a shared embedding function across tasks to measure the similarity between support (train) and query (test) samples. However, the embeddings generated by these approaches fail to account for the feature importance of individual instances and the feature correlation between support and query samples within each task. To tackle this problem, we propose a novel Self-Adaptive Embedding approach (SAE) based on a hierarchical attention scheme. The novelty of SAE is twofold. First, SAE effectively captures the most discriminative features at the instance level, which significantly improves performance on downstream classification tasks. Second, SAE adaptively adjusts the representations of support and query samples by exploiting the feature structures they share at the task level. Experiments demonstrate that SAE significantly outperforms existing state-of-the-art methods.
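The two-level attention idea can be illustrated with a minimal sketch. This is not the paper's actual SAE architecture; the function names (`instance_attention`, `task_attention`), the learned projection `W`, and the pooled task statistic are all simplifying assumptions chosen for illustration. It shows instance-level feature reweighting followed by a task-level adjustment shared by support and query samples, with nearest-support classification by cosine similarity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def instance_attention(X, W):
    # Instance level: weight each feature dimension per instance using
    # importance logits from a learned projection W (here random, for
    # illustration only).
    alpha = softmax(X @ W, axis=-1)   # (n, d) per-instance feature weights
    return X * alpha                  # reweighted embeddings

def task_attention(support, query):
    # Task level: adjust both sets with one shared per-dimension weight
    # derived from pooled task statistics (an assumption, not the paper's
    # exact mechanism).
    task_stat = np.concatenate([support, query], axis=0).mean(axis=0)  # (d,)
    beta = softmax(task_stat)         # feature weights shared by the task
    return support * beta, query * beta

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

rng = np.random.default_rng(0)
d = 8
support = rng.normal(size=(5, d))    # 5-way, 1-shot support set
query = rng.normal(size=(3, d))      # 3 query samples
W = rng.normal(size=(d, d))

# Hierarchical attention: instance level first, then task level.
support_t, query_t = task_attention(instance_attention(support, W),
                                    instance_attention(query, W))

# Classify each query by its most similar (reweighted) support sample.
preds = [int(np.argmax([cosine(q, s) for s in support_t])) for q in query_t]
```

In a real few-shot pipeline the inputs would be backbone features rather than raw vectors, and the attention parameters would be meta-trained across episodes; the sketch only fixes the order of operations (instance-level reweighting before task-level adjustment) that the abstract describes.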