Unsupervised Ensemble Distillation For Multi-Organ Segmentation
Lefei Zhang, Shixiang Feng, Yu Wang, Yan-Feng Wang, Ya Zhang, Xin Chen, Qi Tian
SPS
Multi-organ segmentation is a fundamental task in medical image processing. This paper explores a novel privacy-friendly setting for training a multi-organ segmentation model, \emph{i.e.}, learning directly from multiple pre-trained single-organ segmentation models. We formulate this as a special unsupervised ensemble distillation problem. To solve it, a multi-teacher knowledge distillation framework is proposed, which leverages the soft labels predicted by the pre-trained teacher models to train a student model, \emph{i.e.}, the multi-organ segmentation model. Since each teacher specializes in a different task, the pseudo labels for each organ come from the corresponding teacher, while those for the background region are aggregated from all teachers. To handle the mismatch in output dimensionality between the teacher models and the student model, an output transformation method is introduced. The entire learning process requires only access to the pre-trained models and a reasonable amount of unlabeled target data, achieving a good balance between privacy protection and model performance. Experimental results on a widely adopted benchmark dataset demonstrate the promise of the proposed method.
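The label-combination idea can be sketched in code. The snippet below is a minimal illustration, not the paper's implementation: it assumes each single-organ teacher emits a two-channel (background, organ) soft prediction, merges the K teacher outputs into one (K+1)-channel multi-organ pseudo label by taking each organ channel from its own specialist teacher and averaging the background channels across all teachers, then renormalizes per pixel. The function name, channel layout, and averaging rule are illustrative assumptions.

```python
import numpy as np

def combine_teacher_outputs(teacher_probs):
    """Merge K single-organ soft predictions into one (K+1)-channel pseudo label.

    teacher_probs: list of K arrays, each of shape (2, H, W), where channel 0
    holds the background probability and channel 1 the organ probability.
    (Hypothetical layout; the paper's exact output transformation may differ.)
    """
    num_organs = len(teacher_probs)
    h, w = teacher_probs[0].shape[1:]
    merged = np.empty((num_organs + 1, h, w))

    # Background: every teacher predicts a background channel, so average them.
    merged[0] = np.mean([p[0] for p in teacher_probs], axis=0)

    # Each organ channel comes only from its corresponding specialist teacher.
    for k, p in enumerate(teacher_probs):
        merged[k + 1] = p[1]

    # Renormalize so the merged channels form a valid distribution per pixel.
    merged /= merged.sum(axis=0, keepdims=True)
    return merged
```

The resulting (K+1)-channel map can then serve as the soft target for the student, e.g. under a pixel-wise cross-entropy or KL-divergence distillation loss.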