U-Shiftformer: brain tumor segmentation using a shifted attention mechanism
Chih-Wei Lin (Fujian Agriculture and Forestry University); Zhongsheng Chen (Fujian Agriculture and Forestry University)
In this study, we propose a network based on a shifted attention mechanism, namely U-Shiftformer, to overcome a drawback of existing convolutional neural networks (CNNs) in brain tumor segmentation: the lack of multimodal information interaction. U-Shiftformer takes a U-shaped encoder-decoder structure as its backbone and embeds the proposed Shiftformer module to exchange information between modalities during downsampling. The Shiftformer module contains one standard attention module and three proposed shifted attention modules, where the shifted attention module enables information exchange by constructing relationships between adjacent modalities. In the experiments, we compare the proposed U-Shiftformer with state-of-the-art networks on the Dice, precision, sensitivity, and Hausdorff distance metrics. Its average scores surpass those of all comparison networks, reaching 0.8424, 0.8675, 0.9244, and 1.2961 for Dice, precision, sensitivity, and Hausdorff distance, respectively.
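To make the idea of attention between adjacent modalities concrete, the following is a minimal PyTorch sketch, assuming a layout where each MRI modality contributes its own token sequence and keys/values are taken from a cyclically shifted (adjacent) modality. The class name, the roll-based shift, and all parameters are illustrative assumptions based only on this abstract, not the authors' implementation.

```python
# Hypothetical sketch of a "shifted attention" block across MRI modalities:
# queries come from one modality's features while keys/values come from an
# adjacent modality (features rolled along the modality axis).
import torch
import torch.nn as nn


class ShiftedAttention(nn.Module):
    """Each modality attends to its (cyclically) adjacent modality."""

    def __init__(self, dim: int, num_heads: int = 4, shift: int = 1):
        super().__init__()
        self.shift = shift  # how far to roll along the modality axis
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, modalities, tokens, dim) -- per-modality token sequences
        b, m, n, d = x.shape
        # Pair modality i with modality (i + shift) mod m.
        shifted = torch.roll(x, shifts=self.shift, dims=1)
        q = x.reshape(b * m, n, d)
        kv = shifted.reshape(b * m, n, d)
        out, _ = self.attn(q, kv, kv)  # cross-modality attention
        out = self.norm(out + q)       # residual connection + normalization
        return out.reshape(b, m, n, d)


if __name__ == "__main__":
    # Four MRI modalities (e.g., T1, T1ce, T2, FLAIR), 64 tokens of dim 32.
    feats = torch.randn(2, 4, 64, 32)
    block = ShiftedAttention(dim=32, num_heads=4, shift=1)
    print(block(feats).shape)  # torch.Size([2, 4, 64, 32])
```

Under this reading, a Shiftformer-like module could combine one standard self-attention block (shift of zero) with three such shifted blocks using different shift values, though the abstract does not specify the exact composition.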