27 Oct 2020

Fine-tuning is an effective transfer learning method for achieving strong performance on a target task with limited training data. Some recent works regularize the parameters of deep neural networks toward their pre-trained values for better knowledge transfer. However, these methods enforce homogeneous penalties on all parameters, which leads to catastrophic forgetting or negative transfer. To address this problem, we propose a novel Inhomogeneous Regularization (IR) method that imposes strong regularization on the parameters of transferable convolutional filters to counter catastrophic forgetting, and relaxes the regularization on the parameters of less transferable filters to counter negative transfer. Moreover, we use the decaying averaged deviation of the parameters from the start point (the pre-trained parameters) to measure the transferability of each filter. Evaluation on three challenging benchmark datasets demonstrates the superiority of the proposed model over state-of-the-art methods.
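To make the idea concrete, below is a minimal PyTorch sketch of per-filter inhomogeneous regularization toward pre-trained weights. It is an illustration under stated assumptions, not the paper's implementation: the class name `InhomogeneousRegularizer`, the hyperparameters `decay` and `base_strength`, and the specific mapping from averaged deviation to penalty weight (smaller decaying averaged deviation is treated as more transferable and gets a stronger pull back to the start point) are all assumptions introduced here for illustration.

```python
import torch
import torch.nn as nn


class InhomogeneousRegularizer:
    """Hypothetical sketch: keeps a decaying (exponential moving) average of
    each conv filter's deviation from its pre-trained value and turns it into
    a per-filter penalty weight. Filters that stay close to the start point
    (assumed transferable) get a strong pull back; filters that drift far
    (assumed less transferable) get a weaker one."""

    def __init__(self, model: nn.Module, decay: float = 0.9, base_strength: float = 0.01):
        self.decay = decay
        self.base_strength = base_strength
        # Snapshot of the pre-trained parameters (the "start point"),
        # restricted to 4-D conv filter weights.
        self.start = {
            name: p.detach().clone()
            for name, p in model.named_parameters()
            if p.dim() == 4
        }
        # Decaying per-filter deviation, one scalar per output filter.
        self.avg_dev = {
            name: torch.zeros(p.shape[0], device=p.device)
            for name, p in self.start.items()
        }

    @torch.no_grad()
    def update(self, model: nn.Module):
        """Update the decaying average of each filter's deviation from the start point."""
        for name, p in model.named_parameters():
            if name in self.start:
                dev = (p - self.start[name]).flatten(1).norm(dim=1)
                self.avg_dev[name].mul_(self.decay).add_(dev, alpha=1 - self.decay)

    def penalty(self, model: nn.Module) -> torch.Tensor:
        """Per-filter weighted squared L2 distance to the pre-trained weights."""
        total = torch.zeros((), device=next(model.parameters()).device)
        for name, p in model.named_parameters():
            if name in self.start:
                diff = (p - self.start[name]).flatten(1).pow(2).sum(dim=1)
                # Assumed weighting: small averaged deviation -> strong penalty,
                # large deviation -> weak penalty. The paper's exact mapping may differ.
                weight = self.base_strength / (1.0 + self.avg_dev[name])
                total = total + (weight * diff).sum()
        return total
```

In a fine-tuning loop one would add `regularizer.penalty(model)` to the task loss before back-propagation and call `regularizer.update(model)` after each optimizer step, so the transferability estimate tracks how far each filter has moved from its pre-trained value.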
