CDHD: CONTRASTIVE DREAMER FOR HINT DISTILLATION
Yu Le (Tsinghua University); Hua TongYan (Guangdong Bright Dream Robotics Co., Ltd.); Wenming Yang (Tsinghua University); Ye Peng (Guangdong Bright Dream Robotics Co., Ltd.); Qingmin Liao (Tsinghua University)
Replaying previous training data is the most effective approach
for Class-Incremental Learning (CIL), with its performance
bounded by data availability. Therefore, many recent
studies consider the Data-Free Class-Incremental Learning
(DFCIL) problem that requires no previous data. However,
existing methods do not consider synthesising heterogeneous data, which limits the model's generalizability. Such
homogeneous images further hinder the knowledge distillation
process when only the deeper layers close to
the output are regularised, resulting in catastrophic forgetting. To address
these issues, we present CDHD: a contrastive dreamer for
hint distillation. Our approach starts with training a generator
for data synthesis. A model inversion technique is introduced
to obtain a generator capable of producing heterogeneous
images from the classifier by imposing the Contrastive Loss.
Moreover, to better transfer previous knowledge to the
current model, we force the teacher network to provide more
general knowledge to its student by enforcing the Hint Loss
in shallower layers rather than only in deeper ones. We validate
the performance of CDHD on CIFAR-100 across various
task settings and compare it against the state-of-the-art DFCIL baseline,
demonstrating its superiority and establishing a new
benchmark.
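The abstract names two loss terms: a contrastive loss that pushes the generator toward heterogeneous synthetic images, and a Hint Loss that matches shallow-layer features between teacher and student. The sketch below is not the authors' implementation; it is a minimal NumPy illustration, assuming an NT-Xent-style form for the contrastive term and an L2 penalty between intermediate feature maps for the hint term (function names and shapes are hypothetical).

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent-style contrastive loss over two batches of embeddings.

    z1[i] and z2[i] are treated as a positive pair; all other rows in the
    combined batch serve as negatives. Lower values mean the positive
    pairs are more similar (relative to the negatives).
    """
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarity
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)                     # exclude self-pairs
    n = len(z1)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logits = sim - sim.max(axis=1, keepdims=True)      # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

def hint_loss(f_student, f_teacher):
    """L2 hint loss between intermediate (e.g. shallow-layer) features."""
    return ((f_student - f_teacher) ** 2).mean()
```

Applying the hint penalty at shallow layers, rather than only near the output, regularises more general early-stage representations, which is the intuition the abstract gives for mitigating forgetting.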