Knowledge Transferred Fine-Tuning For Anti-Aliased Convolutional Neural Network In Data-Limited Situation

Satoshi Suzuki, Shoichiro Takeda, Ryuichi Tanida, Hideaki Kimata, Hayaru Shouno

Length: 00:07:12
21 Sep 2021

Anti-aliased convolutional neural networks (CNNs) introduce blur filters into the intermediate representations of CNNs to achieve high accuracy. A promising way to build a new anti-aliased CNN is to fine-tune a pre-trained CNN, which can easily be found online, with blur filters. However, the blur filters drastically degrade the pre-trained representation, so fine-tuning must rebuild that representation from massive training data. When training data is limited, fine-tuning therefore fails because it overfits to the limited data. To tackle this problem, this paper proposes "knowledge transferred fine-tuning." Based on the idea of knowledge transfer, our method transfers knowledge from intermediate representations in the pre-trained CNN to the anti-aliased CNN during fine-tuning. We transfer only essential knowledge using a pixel-level loss, which transfers detailed knowledge, and a global-level loss, which transfers coarse knowledge. Experimental results demonstrate that our method significantly outperforms simple fine-tuning.
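The two transfer terms described in the abstract can be illustrated with a minimal sketch. This is an assumption-laden reconstruction, not the paper's implementation: it assumes both losses are mean-squared errors on feature maps, that the global-level loss compares globally average-pooled features, and that the two terms are combined with hypothetical weights `alpha` and `beta`; the paper's exact formulation may differ.

```python
import numpy as np

def pixel_level_loss(teacher_feat, student_feat):
    # Per-pixel MSE between pre-trained (teacher) and anti-aliased
    # (student) feature maps: transfers detailed, spatially local knowledge.
    return float(np.mean((teacher_feat - student_feat) ** 2))

def global_level_loss(teacher_feat, student_feat):
    # MSE between globally average-pooled features: pooling over the
    # spatial axes keeps only channel-wise statistics, so this term
    # transfers coarse, global knowledge.
    t = teacher_feat.mean(axis=(-2, -1))
    s = student_feat.mean(axis=(-2, -1))
    return float(np.mean((t - s) ** 2))

def transfer_loss(teacher_feat, student_feat, alpha=1.0, beta=1.0):
    # Combined knowledge-transfer term, added to the ordinary task loss
    # while fine-tuning; alpha and beta are hypothetical balance weights.
    return (alpha * pixel_level_loss(teacher_feat, student_feat)
            + beta * global_level_loss(teacher_feat, student_feat))
```

In practice the feature maps would come from matching intermediate layers of the frozen pre-trained CNN and the anti-aliased CNN being fine-tuned; the NumPy arrays here stand in for those activations.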
