Unsupervised Change Detection For Multimodal Remote Sensing Images Via Coupled Dictionary Learning And Sparse Coding
Vinicius Ferraris, Nicolas Dobigeon, Yanna Cavalcanti, Thomas Oberlin, Marie Chabert
Archetypal change detection scenarios generally consider two images acquired through sensors of the same modality. Any resolution dissimilarity is often bypassed through a simple preprocessing step, applied independently to each image to bring both to a common resolution. However, in some important situations, e.g., the aftermath of a natural disaster, the only available images may be those acquired through sensors of different modalities and resolutions. It is therefore essential to develop general and robust methods able to handle this unfavorable situation. This paper proposes a coupled dictionary learning strategy to detect changes between two images of different modalities and possibly different spatial and/or spectral resolutions. The pair of observed images is modeled as sparse linear combinations of atoms belonging to a pair of coupled overcomplete dictionaries learned from the two observed images. The codes are expected to be globally similar over areas unaffected by the changes, while they are expected to differ in some spatially sparse locations. Change detection is then formulated as an inverse problem, namely the estimation of a dual code such that the difference between the estimated codes associated with each image exhibits spatial sparsity. A comparison with state-of-the-art change detection methods demonstrates the superiority of the proposed approach.
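To make the idea concrete, here is a minimal sketch, not the authors' implementation, of the coupled dictionary learning and sparse coding pipeline described above. It assumes a synthetic, co-registered, single-band image pair as a stand-in for real multimodal data; the atom count, patch size, regularization weights, and thresholding rule are all illustrative choices. It also simplifies the paper's joint inverse problem: instead of estimating a dual code with a spatial-sparsity penalty on the code difference, each image is coded independently against its coupled sub-dictionary and the per-patch code discrepancy serves as the change score.

```python
# Sketch (under the assumptions stated above) of coupled dictionary
# learning + sparse coding for change detection; not the authors' code.
import numpy as np
from sklearn.decomposition import DictionaryLearning, SparseCoder
from sklearn.feature_extraction.image import extract_patches_2d

rng = np.random.default_rng(0)

# Synthetic co-registered "multimodal" pair (stand-in for real data).
h = w = 48
img1 = rng.standard_normal((h, w))
img1 = np.cumsum(np.cumsum(img1, 0), 1)        # smooth random field
img1 = (img1 - img1.mean()) / img1.std()
img2 = np.tanh(1.5 * img1) + 0.05 * rng.standard_normal((h, w))
img2[20:32, 20:32] += 1.5                      # simulated change region

# Aligned patch extraction from both modalities.
p = 6
P1 = extract_patches_2d(img1, (p, p)).reshape(-1, p * p)
P2 = extract_patches_2d(img2, (p, p)).reshape(-1, p * p)

# Coupled dictionary learning: stack the two modalities feature-wise so
# that each learned atom contains one sub-atom per modality.
train = rng.choice(len(P1), size=500, replace=False)
X = np.hstack([P1[train], P2[train]])
dl = DictionaryLearning(n_components=64, alpha=1.0, max_iter=20,
                        random_state=0)
dl.fit(X)
D1, D2 = dl.components_[:, :p * p], dl.components_[:, p * p:]

def unit_rows(D):
    """Renormalize sub-dictionary atoms to unit norm for stable coding."""
    n = np.linalg.norm(D, axis=1, keepdims=True)
    return D / np.maximum(n, 1e-12)

coder1 = SparseCoder(dictionary=unit_rows(D1),
                     transform_algorithm='lasso_lars', transform_alpha=0.5)
coder2 = SparseCoder(dictionary=unit_rows(D2),
                     transform_algorithm='lasso_lars', transform_alpha=0.5)

# Sparse codes of each image against its own coupled sub-dictionary.
A1, A2 = coder1.transform(P1), coder2.transform(P2)

# Change score: per-patch code discrepancy; large values flag changes.
score = np.linalg.norm(A1 - A2, axis=1).reshape(h - p + 1, w - p + 1)
change_map = score > score.mean() + 2 * score.std()
print("flagged patch positions:", change_map.sum())
```

Because the two sub-dictionaries are learned jointly from stacked patches, unchanged areas are reconstructed by the same atoms in both modalities and yield similar codes, so their discrepancy stays small; the paper's full formulation tightens this by estimating both codes jointly under an explicit spatial-sparsity constraint on their difference.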