
Microscale Image Enhancement via PCA and Well-Exposedness Maps

Zeynep Ovgu Yayci, Ugur Dura, Zeynep Betul Kaya, Arif E. Cetin, Mehmet Turkan

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:09:23
03 Oct 2022

How to improve the generalization of CNN models has been a long-standing problem in the deep learning community. This paper presents a method, free of runtime parameter and FLOPs overhead, that strengthens CNN models by stacking linear convolution operations during training. We show that overparameterization with appropriate regularization can lead to a smooth optimization landscape that improves performance. Concretely, we propose to add a $1\times 1$ convolutional layer before and another after the original $k\times k$ convolutional layer, without any non-linear activations between them. In addition, Quasi-Orthogonal Regularization is proposed to keep the added $1\times 1$ filters close to orthogonal matrices. After training, these two $1\times 1$ layers can be fused into the original $k\times k$ layer without changing the original network architecture, leaving no extra computation at inference, i.e., the method is parameter/FLOPs-free.
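Because all three layers are linear (no activations in between), the composed operator 1×1 → k×k → 1×1 is itself a single k×k convolution, which is why fusion after training costs nothing at inference. A minimal NumPy sketch of this fusion, assuming a naive valid-correlation `conv2d` and illustrative shapes (this is not the authors' implementation):

```python
import numpy as np

def conv2d(x, w):
    """Naive valid cross-correlation. x: (Cin, H, W), w: (Cout, Cin, k, k)."""
    cout, cin, k, _ = w.shape
    _, H, W = x.shape
    out = np.zeros((cout, H - k + 1, W - k + 1))
    for o in range(cout):
        for i in range(cin):
            for u in range(H - k + 1):
                for v in range(W - k + 1):
                    out[o, u, v] += np.sum(x[i, u:u + k, v:v + k] * w[o, i])
    return out

def fuse(w1, wk, w2):
    """Collapse 1x1 (w1) -> kxk (wk) -> 1x1 (w2) into one kxk kernel.

    w1: (m1, cin, 1, 1), wk: (m2, m1, k, k), w2: (cout, m2, 1, 1).
    fused[o, i, :, :] = sum_{m2, m1} w2[o, m2] * wk[m2, m1, :, :] * w1[m1, i]
    """
    return np.einsum('om,mnkl,ni->oikl', w2[:, :, 0, 0], wk, w1[:, :, 0, 0])

# Hypothetical channel counts / kernel size, just for demonstration.
rng = np.random.default_rng(0)
cin, m1, m2, cout, k = 3, 4, 4, 2, 3
w1 = rng.standard_normal((m1, cin, 1, 1))
wk = rng.standard_normal((m2, m1, k, k))
w2 = rng.standard_normal((cout, m2, 1, 1))
x = rng.standard_normal((cin, 8, 8))

# Sequential training-time path vs. the single fused kxk conv.
seq = conv2d(conv2d(conv2d(x, w1), wk), w2)
fused = conv2d(x, fuse(w1, wk, w2))
print(np.allclose(seq, fused))  # → True
```

The fused kernel has exactly the shape of the original k×k layer, so swapping it in leaves the network architecture, parameter count, and FLOPs at inference unchanged.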

