  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Pages/Slides: 42
24 Aug 2022

Recently, deep convolutional neural networks (CNNs) have been widely used for Single Image Super-Resolution (SISR) and have achieved great success. However, most existing methods are limited by local receptive fields, treat different types of information equally, and cannot effectively aggregate hierarchical feature information. To address these issues, we propose an attention cube network (A-CubeNet). Specifically, an adaptive spatial attention branch (ASAB) and an adaptive channel attention branch (ACAB) constitute the adaptive dual attention module (ADAM), which captures long-range spatial and channel-wise contextual information to enlarge the receptive field and distinguish different types of information. Furthermore, the adaptive hierarchical attention module (AHAM) captures long-range hierarchical contextual information to flexibly aggregate different feature maps depending on the global context.
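The channel-attention idea above, reweighting feature channels according to their global context so that informative channels are emphasized, can be sketched minimally. The function below is a hypothetical simplification for illustration (global average pooling followed by a softmax gate), not the paper's exact ADAM/AHAM design:

```python
import numpy as np

def channel_attention(features):
    """Toy channel-attention sketch: reweight each channel by a softmax
    over its globally pooled response (illustrative, not A-CubeNet's exact design)."""
    # features: (C, H, W)
    pooled = features.mean(axis=(1, 2))        # global context per channel
    weights = np.exp(pooled - pooled.max())    # numerically stable softmax
    weights /= weights.sum()
    return features * weights[:, None, None]   # emphasize high-response channels
```

Channels with a stronger global response receive larger weights, which is the "distinguish different types of information" behavior the abstract describes, here reduced to its simplest form.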
Moreover, Non-Local Attention (NLA) brings significant improvements to SISR by exploiting the intrinsic feature correlations in natural images. However, NLA assigns large weights to noisy information, and its computation cost grows quadratically with the input size, limiting its performance and applicability. We therefore propose a novel Efficient Non-Local Contrastive Attention (ENLCA), which consists of two parts: Efficient Non-Local Attention (ENLA) and Sparse Aggregation. ENLA adopts the kernel method to approximate the exponential function and thereby attains linear computational complexity. Sparse Aggregation makes the network focus on informative features and applies contrastive learning to further separate relevant from irrelevant features. Experiments demonstrate the superiority of both methods over state-of-the-art approaches in quantitative comparison and visual analysis.
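The "kernel method to approximate the exponential function" is the key to linear complexity: if exp(q·k) is factorized as φ(q)·φ(k), then softmax attention softmax(QKᵀ)V can be computed as φ(Q)(φ(K)ᵀV), aggregating keys and values once instead of per query. The sketch below uses generic positive random features under that assumption; the function names and feature count are illustrative, not ENLCA's actual implementation:

```python
import numpy as np

def random_feature_map(x, omega):
    # phi(x) = exp(omega @ x - ||x||^2 / 2) / sqrt(m); in expectation over
    # omega ~ N(0, I), phi(q) . phi(k) approximates exp(<q, k>)
    proj = x @ omega.T
    return np.exp(proj - (x ** 2).sum(-1, keepdims=True) / 2) / np.sqrt(omega.shape[0])

def linear_nonlocal_attention(q, k, v, num_features=256, seed=0):
    """Approximate softmax(Q K^T) V with linear cost in sequence length:
    factor through random features so keys/values are aggregated once."""
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal((num_features, q.shape[-1]))
    qf = random_feature_map(q, omega)        # (N, m)
    kf = random_feature_map(k, omega)        # (N, m)
    kv = kf.T @ v                            # (m, d_v): one global aggregation
    norm = qf @ kf.sum(axis=0)               # approximate softmax normalizer
    return (qf @ kv) / norm[:, None]
```

Because the random features are strictly positive, the approximate attention weights remain a positive, normalized distribution over positions, so each output stays a convex combination of the value vectors; the cost is O(N·m) rather than the O(N²) of exact non-local attention.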
