Graph Fine-Grained Contrastive Representation Learning

Hui Tang, Xun Liang, Yuhui Guo, Xiangping Zheng, Bo Wu

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:06:56
09 May 2022

Existing graph contrastive methods have benefited from ingenious data augmentations and mutual information estimation operations that are carefully designed to augment graph views and maximize the agreement between representations produced at the final layer of two view networks. However, this design of graph CL schemes is coarse-grained and struggles to capture the universal and intrinsic properties across intermediate layers. To address this problem, we propose a novel fine-grained graph contrastive learning model (FGCL), which decomposes graph CL into global-to-local levels and disentangles the two graph views into hierarchical graphs via pooling operations to capture both global and local dependencies across views and across layers. To prevent layer mismatch and automatically assign proper hierarchical representations of the augmented graph (Key view) to each pooling layer of the original graph (Query view), we propose a semantic-aware layer allocation strategy that integrates positive guidance from diverse representations rather than from a single manually fixed layer. Experimental results demonstrate the advantages of our model on the graph classification task. This suggests that the proposed fine-grained graph CL has great potential for graph representation learning.
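The abstract does not spell out how the semantic-aware layer allocation works; a plausible reading is that each Query-view pooling layer receives a positive target built as a similarity-weighted mixture of the Key view's hierarchical representations, rather than the Key-view layer at the same index. Below is a minimal, illustrative NumPy sketch under that assumption (the function name, temperature parameter, and softmax weighting are all hypothetical, not taken from the paper):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two representation vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

def allocate_key_layers(query_repr, key_layer_reprs, temperature=0.5):
    """Hypothetical semantic-aware layer allocation: instead of pairing a
    Query-view pooling layer with the Key-view layer at the same depth,
    weight ALL Key-view hierarchical representations by their
    softmax-normalized similarity to the Query-view representation and
    use the weighted mixture as the positive target."""
    sims = np.array([cosine(query_repr, k) for k in key_layer_reprs])
    logits = sims / temperature
    weights = np.exp(logits - logits.max())   # numerically stable softmax
    weights /= weights.sum()
    # Positive target = similarity-weighted mix of Key-view layers
    positive = sum(w * k for w, k in zip(weights, key_layer_reprs))
    return weights, positive
```

Layers of the Key view that are semantically closer to the given Query-view layer thus contribute more positive guidance, which avoids the layer-mismatch problem a fixed index-to-index pairing would cause.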
