
Select the Best: Enhancing Graph Representation with Adaptive Negative Sample Selection

Xiangping Zheng (Renmin University of China); Xun Liang (Renmin University of China); Bo Wu (Renmin University of China)

06 Jun 2023

Graph contrastive learning (GCL) has emerged as a powerful tool for the widespread label-scarcity problem in real-world graph data and has achieved impressive success in the graph learning domain. Despite the rapid development of GCL methods, the strategy of graph sample selection, a crucial component of contrastive learning, remains rarely explored. Moreover, most existing methods adopt uniform negative-sampling schemes, such as uniformly random sampling, which leads to suboptimal performance. In this paper, we study the impact of negative samples on learning graph-level representations and propose Reinforcement Graph Contrastive Learning (ReinGCL) for negative sample selection. Concretely, our model consists of two major components: a graph contrastive learning framework (GCLF) and a selection distribution generator (SDG) that produces selection probabilities based on reinforcement learning (RL). The critical insight is that ReinGCL leverages the SDG to guide the GCLF in selecting negative samples from the training data, widening the divergence between positive and negative samples and thereby further improving graph representation learning. Extensive experiments on multiple public datasets demonstrate that our approach consistently outperforms numerous competitive baselines.
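The abstract does not give implementation details, but the mechanism it describes — an SDG producing a selection distribution over candidate negatives, which the contrastive objective then consumes — can be sketched in miniature. The following is a hedged illustration only: the scorer, candidate pool, sample count `k`, and softmax-based selection distribution are all assumptions, not the paper's actual method, and a simple similarity score stands in for the learned RL policy.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def select_negatives(anchor, candidates, scorer, k, rng):
    # SDG-like step (hypothetical): score each candidate negative, turn the
    # scores into a selection distribution, then sample k negatives from it.
    scores = np.array([scorer(anchor, c) for c in candidates])
    probs = softmax(scores)
    idx = rng.choice(len(candidates), size=k, replace=False, p=probs)
    return idx, probs

def info_nce(anchor, positive, negatives, temp=0.5):
    # Standard InfoNCE objective over the selected negatives.
    def sim(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([sim(anchor, positive)]
                      + [sim(anchor, n) for n in negatives]) / temp
    return -np.log(softmax(logits)[0])

# Toy graph-level embeddings (stand-ins for GCLF encoder outputs).
dim = 8
anchor = rng.normal(size=dim)
positive = anchor + 0.1 * rng.normal(size=dim)   # augmented view of the anchor
candidates = rng.normal(size=(32, dim))          # pool of candidate negatives

# A naive scorer favoring "hard" negatives similar to the anchor;
# in ReinGCL this role would be played by the RL-trained SDG.
scorer = lambda a, c: a @ c / (np.linalg.norm(a) * np.linalg.norm(c))

idx, probs = select_negatives(anchor, candidates, scorer, k=8, rng=rng)
loss = info_nce(anchor, positive, candidates[idx])
print(f"selected {len(idx)} negatives, InfoNCE loss = {loss:.4f}")
```

In this toy setup, replacing uniform sampling with the score-weighted distribution biases selection toward harder negatives, which is the kind of effect the paper attributes to the SDG.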
