LONG-SHORT ATTENTION NETWORK FOR THE SPECTRAL SUPER-RESOLUTION OF MULTISPECTRAL IMAGES

Kai Zhang (Shandong Normal University); Tian Jin (Shandong Normal University); Feng Zhang (Shandong Normal University); Jiande Sun (Shandong Normal University)

06 Jun 2023

Owing to their efficiency in modeling long-range dependencies, transformer-based spectral reconstruction methods have produced satisfactory hyperspectral (HS) images from multispectral (MS) images. Some transformer-based methods apply self-attention to all bands of the HS image to model the relationships among them, which ignores the high correlations between adjacent bands and the low correlations among nonadjacent ones. To learn the global relationships among all bands and the correlations between adjacent bands simultaneously, this paper proposes a long-short attention network (LSA-Net) for the spectral super-resolution of MS images. Specifically, LSA-Net is composed of cascaded long-range attention blocks and short-range attention blocks. In the long-range attention blocks, self-attention is applied across all channels, with each channel modeled as a token. Grouped channels are then fed into the short-range attention blocks, where correlations are learned from the similarities among neighboring channels. With the introduction of long- and short-range attention, the relationships among spectral bands are better preserved. Experiments on the CAVE dataset demonstrate the effectiveness of the proposed LSA-Net. The code is available at https://github.com/RSMagneto/LSA-Net.
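The long-/short-range split described above can be sketched in a few lines. The sketch below is a simplification, not the paper's implementation (see the linked repository for that): it uses identity projections in place of learned Q/K/V projections and a fixed hypothetical group size, but it shows the core idea of treating each spectral band as a token for global (long-range) attention, and restricting attention to groups of adjacent bands for short-range attention.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def long_range_attention(x):
    """Channel-wise self-attention over all C bands.

    x: (C, N) array, one row per spectral band, N = H*W flattened pixels.
    Each band is one token, so the attention map is (C, C): every band
    attends to every other band. Identity Q/K/V projections are an
    illustrative simplification.
    """
    attn = softmax(x @ x.T / np.sqrt(x.shape[1]))  # (C, C) band-to-band weights
    return attn @ x                                # (C, N)

def short_range_attention(x, group=4):
    """The same attention, but only within groups of adjacent bands.

    Splitting the C bands into contiguous groups exploits the high
    correlation between neighboring bands while ignoring distant ones.
    `group` is a hypothetical hyperparameter.
    """
    out = np.empty_like(x)
    for s in range(0, x.shape[0], group):
        out[s:s + group] = long_range_attention(x[s:s + group])
    return out

x = np.random.rand(8, 16)          # 8 bands, 16 pixels
y_long = long_range_attention(x)   # global band relationships
y_short = short_range_attention(x) # adjacent-band relationships
```

Note that when the group size equals the total number of bands, short-range attention degenerates into long-range attention; LSA-Net cascades the two block types so that both kinds of relationships are captured.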
