Explore Connection Pattern and Attention Mechanism for Lightweight Image Super-Resolution
Zhu Qin, Taiping Zhang
Despite the great success of CNNs (Convolutional Neural Networks) in SISR (Single Image Super-Resolution), increasing network depth leads to higher computational complexity and memory usage, which severely hinders real-world applications. To solve this problem, we propose a lightweight cascade fusion network (CFNet) built by stacking cascade fusion blocks (CFBs), which adopt both cascade connections and fusion connections to fully exploit the feature-extraction ability of convolution. Specifically, cascade connections help transfer low-level features from the source, while fusion connections help aggregate the hierarchical features produced by intermediate convolution layers. To further improve performance, we also design an efficient low-dimensional pixel attention (LPA) mechanism for SISR tasks and summarize several design guidelines. Thanks to the LPA module, our CFNet improves the final reconstruction quality with little parameter cost. Extensive experimental results show that the proposed CFNet achieves a better trade-off between performance and model complexity than state-of-the-art methods.
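The abstract does not spell out the internals of the CFB or the LPA module, so the following is only a minimal PyTorch sketch of how such a block could be wired: the block input is carried forward as a cascade connection, every intermediate convolution output is collected and fused by a 1x1 convolution (fusion connection), and a pixel attention map is computed in a reduced-channel space before reweighting the fused features. All module names, channel counts, and the `reduction` parameter are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class LowDimPixelAttention(nn.Module):
    """Pixel attention computed in a reduced-channel space (assumed design)."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        reduced = max(channels // reduction, 1)
        self.down = nn.Conv2d(channels, reduced, kernel_size=1)  # project to low dimension
        self.up = nn.Conv2d(reduced, channels, kernel_size=1)    # restore full dimension
        self.act = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        mask = self.act(self.up(self.down(x)))  # per-pixel, per-channel attention map
        return x * mask


class CascadeFusionBlock(nn.Module):
    """Hypothetical CFB: the cascade connection keeps the block input,
    fusion connections gather all intermediate features for a 1x1 fusion conv."""

    def __init__(self, channels: int, num_convs: int = 3):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            for _ in range(num_convs)
        )
        self.act = nn.LeakyReLU(0.2, inplace=True)
        # fuse the cascaded input plus every intermediate feature map
        self.fuse = nn.Conv2d(channels * (num_convs + 1), channels, kernel_size=1)
        self.attention = LowDimPixelAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [x]          # cascade connection: preserve low-level source features
        out = x
        for conv in self.convs:
            out = self.act(conv(out))
            feats.append(out)  # fusion connection: collect hierarchical features
        fused = self.fuse(torch.cat(feats, dim=1))
        return self.attention(fused) + x  # residual over the block
```

Stacking several such blocks and appending a pixel-shuffle upsampler would give a CFNet-style lightweight SISR network; the exact block count and channel width reported in the paper are not stated in this abstract.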