DUAL META CALIBRATION MIX FOR IMPROVING GENERALIZATION IN META-LEARNING
Ze-Yu Mi (Nanjing University); Yu-Bin Yang (State Key Laboratory for Novel Software Technology, Nanjing University)
Meta-learning has achieved remarkable success as a powerful
paradigm for transferring knowledge learned from previous
tasks. However, the scarcity of diverse, high-quality tasks is
the bottleneck of current meta-learning: it easily leads to
overfitting and therefore seriously hurts generalization.
In this paper, to address this challenge, we propose Dual Meta
Calibration Mix (DMCM), which improves the diversity and
quality of tasks by providing "more data".
Concretely, we design a dual augmentation framework and a
meta calibration mix. The dual augmentation framework augments
individual tasks and pairs of tasks by linearly combining
samples and labels from both the support and query sets, respectively.
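For intuition, the following is a minimal mixup-style sketch of the kind of linear combination this augmentation builds on, assuming NumPy arrays of identical shape for the two batches and one-hot labels; the function name, the Beta prior, and the choice of which batches to pair are illustrative assumptions rather than the exact DMCM formulation.

import numpy as np

def mix_batches(x_a, y_a, x_b, y_b, alpha=0.5, rng=None):
    """Linearly combine two equally shaped batches of samples and one-hot labels."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)           # mixing coefficient in [0, 1]
    x_mix = lam * x_a + (1.0 - lam) * x_b  # pixel-wise interpolation of samples
    y_mix = lam * y_a + (1.0 - lam) * y_b  # matching interpolation of labels
    return x_mix, y_mix

For example, mixing the support sets of two sampled tasks (or a task's support and query sets, when their sizes match) yields a new synthetic set whose samples and labels remain consistent with each other.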
The meta calibration mix generates new samples
by linearly combining image patches and the corresponding labels
according to a calibrated mixing matrix and a calibrated label.
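As a rough analogue of the patch-level mixing described above, the sketch below follows the generic CutMix-style mechanism: a binary mixing matrix selects a rectangular patch that is swapped between two images, and the label is mixed in proportion to the kept area. The calibration of the mixing matrix and label performed by DMCM is not reproduced here; the names, shapes, and patch-sampling scheme are assumptions for illustration only.

import numpy as np

def patch_mix(x_a, y_a, x_b, y_b, lam, rng=None):
    """Paste a rectangular patch of x_b into x_a; mix labels by the kept area."""
    rng = rng or np.random.default_rng()
    _, h, w = x_a.shape                              # assumed (channels, height, width)
    cut_h, cut_w = int(h * np.sqrt(1.0 - lam)), int(w * np.sqrt(1.0 - lam))
    cy, cx = rng.integers(h), rng.integers(w)        # patch center
    y1, y2 = np.clip([cy - cut_h // 2, cy + cut_h // 2], 0, h)
    x1, x2 = np.clip([cx - cut_w // 2, cx + cut_w // 2], 0, w)
    mask = np.ones((h, w))                           # binary mixing matrix: 1 keeps x_a
    mask[y1:y2, x1:x2] = 0.0
    x_mix = mask * x_a + (1.0 - mask) * x_b
    kept = 1.0 - (y2 - y1) * (x2 - x1) / (h * w)     # fraction of x_a that survives
    y_mix = kept * y_a + (1.0 - kept) * y_b          # label weighted by kept area
    return x_mix, y_mix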
Extensive experiments show that our proposed method
significantly improves the generalization of meta-learning algorithms
and consistently outperforms other state-of-the-art
regularization-based meta-learning methods.