ICEL: Learning with Inconsistent Explanations
Biao Liu (Southern University of Science and Technology); xiaoyu wu (Huawei); Bo Yuan (Southern University of Science and Technology)
Generating heatmaps is a common explanation method for showing which regions a model relies on when making predictions in vision tasks, and GradCAM is a popular approach for producing such heatmaps. However, GradCAM is post-hoc, and its heatmaps do not always match human annotations. Inspired by CGC (Contrastive GradCAM Consistency), we propose ICEL (InConsistent Explanation Learning), which introduces an inconsistent explanation loss measured by cosine similarity between heatmaps. We show that our method preserves classification accuracy while producing heatmaps that are more consistent with human annotations, and that it reduces the computational complexity from $O(n^2)$ to $O(n)$ compared with CGC.
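A minimal sketch of how a cosine-similarity loss on Grad-CAM heatmaps could be implemented in PyTorch. The abstract does not give the exact ICEL formulation, so the names below (gradcam_heatmap, inconsistency_loss, the reference map) are illustrative assumptions rather than the authors' code; the reference here is taken to be a human annotation resized to the feature-map resolution.

```python
# Sketch: Grad-CAM heatmap + cosine-similarity explanation loss (assumptions, not the ICEL code).
import torch
import torch.nn.functional as F

def gradcam_heatmap(features, logits, labels):
    """Grad-CAM: weight feature maps by the spatially pooled gradients of the target logits.

    features: (B, C, H, W) activations from a convolutional layer (requires grad).
    logits:   (B, num_classes) classifier outputs computed from `features`.
    labels:   (B,) target class indices.
    """
    score = logits.gather(1, labels.view(-1, 1)).sum()
    # create_graph=True so the explanation loss can backpropagate through Grad-CAM.
    grads = torch.autograd.grad(score, features, create_graph=True)[0]   # (B, C, H, W)
    weights = grads.mean(dim=(2, 3), keepdim=True)                       # (B, C, 1, 1)
    cam = F.relu((weights * features).sum(dim=1))                        # (B, H, W)
    return cam

def inconsistency_loss(cam, reference):
    """Penalize low cosine similarity between a heatmap and a reference map
    (e.g., a human annotation downsampled to the heatmap resolution)."""
    cos = F.cosine_similarity(cam.flatten(1), reference.flatten(1), dim=1)  # in [-1, 1]
    return (1.0 - cos).mean()  # 0 when the heatmap is perfectly aligned with the reference

# Hypothetical training objective: cross-entropy plus a weighted explanation term,
# e.g. loss = F.cross_entropy(logits, labels) + lambda_expl * inconsistency_loss(cam, reference),
# where lambda_expl is an assumed hyper-parameter.
```

Because the loss compares each heatmap against a single reference rather than against every other augmented view in a batch, the per-batch cost of the explanation term scales linearly in the number of samples, which is consistent with the $O(n^2)$ to $O(n)$ reduction claimed relative to CGC.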