STANet: Spatiotemporal Adaptive Network For Remote Sensing Images
Chenlu Hu, Mengting Ma, Xiaowen Ma, Huanting Zhang, Dun Wu, Guang Gao, Wei Zhang
Spatiotemporal fusion aims to generate remote sensing images with both high spatial and high temporal resolution. Conventional spatiotemporal fusion methods usually rely on convolution operations for feature extraction, which limits their ability to capture long-range dependencies. Meanwhile, the large difference in spatial resolution between the input images makes it difficult to reconstruct detailed textures. To address these issues, we propose a GAN-based multi-stage spatiotemporal adaptive network (STANet) for remote sensing images that combines temporal feature refinement with spatial texture transfer. In particular, we design a temporal interaction module (TIM) that uses a cross-temporal gating mechanism to extract and emphasize features of surface changes over time. We also employ adaptive instance normalization (AdaIN) layers to learn global spatial correlations by transferring texture from the fine image to the coarse image. Experiments on two datasets show that the proposed method outperforms other state-of-the-art methods on several metrics.
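The two mechanisms named in the abstract can be illustrated with a minimal sketch, assuming PyTorch; the module names, channel sizes, and wiring below are illustrative assumptions and not the authors' implementation.

# Minimal sketch (not the authors' code) of the two ideas in the abstract:
# AdaIN-based texture transfer and a cross-temporal gating block.
import torch
import torch.nn as nn


def adain(content, style, eps=1e-5):
    """Adaptive instance normalization: match the per-channel statistics of
    the coarse (content) features to those of the fine (style) features."""
    c_mean = content.mean(dim=(2, 3), keepdim=True)
    c_std = content.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style.mean(dim=(2, 3), keepdim=True)
    s_std = style.std(dim=(2, 3), keepdim=True) + eps
    return s_std * (content - c_mean) / c_std + s_mean


class CrossTemporalGate(nn.Module):
    """Illustrative cross-temporal gating: a sigmoid gate computed from the
    features of the two dates re-weights the temporal change signal."""

    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )

    def forward(self, feat_t1, feat_t2):
        g = self.gate(torch.cat([feat_t1, feat_t2], dim=1))
        # Emphasize surface changes: add the gated difference back to the
        # features of the prediction date.
        return feat_t2 + g * (feat_t2 - feat_t1)


if __name__ == "__main__":
    coarse = torch.randn(1, 64, 32, 32)   # features from the coarse image
    fine = torch.randn(1, 64, 32, 32)     # features from the fine image
    fused = adain(coarse, fine)           # spatial texture transfer
    refined = CrossTemporalGate(64)(coarse, fused)
    print(refined.shape)                  # torch.Size([1, 64, 32, 32])

In this sketch the gate decides, per pixel and channel, how strongly the change between the two dates should influence the refined features, while AdaIN injects the fine image's texture statistics into the coarse features; how these blocks are arranged across the multiple stages of STANet is described in the full paper.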