Cross-View Attention Network For Breast Cancer Screening From Multi-View Mammograms
Xuran Zhao, Luyang Yu, Xun Wang
SPS
In this paper, we address the problem of breast cancer detection from multi-view mammograms. We present a novel cross-view attention module (CvAM) which implicitly learns to focus on cancer-related local abnormal regions and to highlight salient features by exploiting cross-view information among the four views of a screening mammography exam, e.g., asymmetries between the left and right breasts and lesion correspondence between two views of the same breast. More specifically, the proposed CvAM computes spatial attention maps from the same view of the two breasts to enhance bilaterally asymmetric regions, and channel attention maps from the two views of the same breast to enhance the feature channels corresponding to the same lesion. CvAMs can be easily integrated into standard convolutional neural network (CNN) architectures such as ResNet to form a multi-view classification model. Experiments are conducted on the DDSM dataset, and results show that CvAMs not only provide better classification accuracy than non-attention and single-view attention models, but also demonstrate better abnormality localization when inspected with CNN visualization tools.
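The abstract does not give the exact formulation of the two attention branches, but the described mechanism can be sketched as follows. This is a hypothetical NumPy illustration, not the authors' implementation: the spatial branch is assumed to compare channel-pooled maps of the same view across the two breasts, and the channel branch is assumed to reweight channels using a global descriptor of the other view of the same breast. Function names (`spatial_attention`, `channel_attention`, `cvam`) and all tensor shapes are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def spatial_attention(feat, feat_other_breast):
    # Hypothetical bilateral branch: channel-pool both same-view feature
    # maps and derive an (H, W) map that emphasizes regions where the two
    # breasts differ (bilateral asymmetry).
    pooled = feat.mean(axis=0)                    # (H, W)
    pooled_other = feat_other_breast.mean(axis=0) # (H, W)
    attn = sigmoid(pooled - pooled_other)         # values in (0, 1)
    return feat * attn[None, :, :]                # broadcast over channels

def channel_attention(feat, feat_other_view):
    # Hypothetical ipsilateral branch: global-average-pool the other view
    # of the same breast into a channel descriptor, then reweight this
    # view's channels so that channels active for the same lesion in both
    # views are enhanced.
    desc = feat_other_view.mean(axis=(1, 2))      # (C,)
    attn = sigmoid(desc)                          # values in (0, 1)
    return feat * attn[:, None, None]

def cvam(l_cc, r_cc, l_mlo, r_mlo):
    # Apply both attentions to the left-CC feature map as an example;
    # in a full model the same operation would run for every view.
    out = spatial_attention(l_cc, r_cc)   # same view, other breast
    out = channel_attention(out, l_mlo)   # same breast, other view
    return out

# Toy feature maps standing in for CNN activations of the four views.
C, H, W = 8, 4, 4
rng = np.random.default_rng(0)
l_cc, r_cc, l_mlo, r_mlo = (rng.standard_normal((C, H, W)) for _ in range(4))
out = cvam(l_cc, r_cc, l_mlo, r_mlo)
print(out.shape)  # (8, 4, 4): attention reweights features, shape unchanged
```

Because both branches only rescale the feature tensor elementwise, the module preserves the feature-map shape, which is what allows it to be dropped between stages of a backbone such as ResNet.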