NC-WAMKD: Neighborhood Correction Weight-Adaptive Multi-teacher Knowledge Distillation For Graph-based Semi-supervised Node Classification

Jiahao Liu (Xi'an Jiaotong University); Pengcheng Guo (Xi'an Jiaotong University); Yonghong Song (Xi'an Jiaotong University)

06 Jun 2023

Multi-teacher knowledge distillation can improve the performance of student networks on semi-supervised node classification tasks, but existing works ignore the differing importance of individual teachers, simply averaging the predictions of multiple teachers to obtain the final prediction. In addition, they rely on a large amount of labeled data, which is inconsistent with the semi-supervised learning setting. To address these limitations, we propose a Neighborhood Correction Weight-Adaptive Multi-teacher Knowledge Distillation (NC-WAMKD) framework, which consists of the knowledge distillation strategy WAMKD and a student model with Neighborhood Correction Label Propagation and Feature Transformation (NCLF). Specifically, WAMKD adaptively assigns weights to the multiple teachers so that the student is not misled by low-quality teachers. NCLF relies on neighborhood-correction label propagation and feature transformation to reliably predict labels for unlabeled nodes. Experiments on semi-supervised node classification tasks demonstrate the effectiveness of the proposed framework.
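
To make the weight-adaptive distillation idea concrete, below is a minimal PyTorch sketch of a multi-teacher distillation loss in which each teacher's weight is derived from its accuracy on the labeled nodes. The weighting rule (a softmax over negative per-teacher cross-entropy) and the function name `wamkd_loss` are illustrative assumptions, not the exact formulation used in the paper.

```python
# Minimal sketch of weight-adaptive multi-teacher distillation (assumed form,
# not the paper's exact WAMKD rule).
import torch
import torch.nn.functional as F

def wamkd_loss(student_logits, teacher_logits_list, labels, labeled_mask, T=2.0):
    """student_logits:      [N, C] student predictions for all nodes
    teacher_logits_list: list of [N, C] logits, one per teacher
    labels:              [N] ground-truth labels (valid where labeled_mask)
    labeled_mask:        [N] boolean mask of labeled nodes
    """
    # Score each teacher by its cross-entropy on the labeled nodes:
    # lower error -> larger weight (hypothetical quality proxy).
    errors = torch.stack([
        F.cross_entropy(t[labeled_mask], labels[labeled_mask])
        for t in teacher_logits_list
    ])                                   # [K]
    weights = F.softmax(-errors, dim=0)  # [K], better teachers get more weight

    # Weighted ensemble of the teachers' softened predictions.
    teacher_probs = torch.stack([
        F.softmax(t / T, dim=-1) for t in teacher_logits_list
    ])                                   # [K, N, C]
    ensemble = (weights.view(-1, 1, 1) * teacher_probs).sum(dim=0)  # [N, C]

    # Distillation term over all nodes (labeled and unlabeled).
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                  ensemble, reduction="batchmean") * (T * T)
    # Supervised term on the labeled nodes only.
    ce = F.cross_entropy(student_logits[labeled_mask], labels[labeled_mask])
    return ce + kd
```

In this sketch, a low-quality teacher receives an exponentially smaller weight in the ensemble, which is the intuition behind avoiding the plain average of teachers described in the abstract.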
