
RETHINKING RANDOM WALK IN GRAPH REPRESENTATION LEARNING

DingYi Zeng (University of Electronic Science and Technology of China); Wenyu Chen (University of Electronic Science and Technology of China); Wanlong Liu (University of Electronic Science and Technology of China); Li Zhou (University of Electronic Science and Technology of China); Hong Qu (University of Electronic Science and Technology of China)

07 Jun 2023

With the help of deep learning, Graph Neural Networks (GNNs) have achieved remarkable progress in various fields. However, the message passing mechanism of GNNs places an inherent upper limit on their expressiveness. Some higher-order GNNs achieve stronger expressiveness, but suffer from high complexity and weaker real-world performance. In this paper, we aim to provide a graph neural network architecture that simultaneously addresses expressiveness, complexity, and real-world performance. To this end, we propose Spatially constrained Random walk diffusion structural Encoding (SRE), which encodes structural information and can be combined with any GNN under our architecture. Extensive and diverse experiments on datasets of different types and sizes demonstrate the superior expressiveness of our architecture and its state-of-the-art performance on real-world tasks.
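
The abstract does not detail how SRE is computed, so as context only, the following is a minimal sketch of the generic random-walk diffusion structural-encoding idea it builds on (RWSE-style return probabilities), not the paper's SRE: each node is described by the diagonal entries of successive powers of the random-walk matrix P = D^{-1}A. The function name, step count, and spatial constraint handling here are illustrative assumptions.

```python
# Illustrative random-walk structural encoding (RWSE-style), NOT the SRE of this paper.
# Per-node features are the k-step return probabilities diag(P^k), P = D^{-1} A, k = 1..K.
import numpy as np

def random_walk_structural_encoding(adj: np.ndarray, num_steps: int = 4) -> np.ndarray:
    """Return an (n, num_steps) matrix of k-step return probabilities."""
    deg = adj.sum(axis=1)
    deg[deg == 0] = 1.0                 # guard isolated nodes against division by zero
    transition = adj / deg[:, None]     # row-normalised random-walk matrix P
    walk = transition.copy()
    features = []
    for _ in range(num_steps):
        features.append(np.diag(walk))  # probability of returning to the start node in k steps
        walk = walk @ transition
    return np.stack(features, axis=1)

# Example: a 4-cycle; all nodes get identical encodings because the graph is vertex-transitive.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
print(random_walk_structural_encoding(A, num_steps=3))
```

In a GNN pipeline, such encodings are typically concatenated to the initial node features before message passing; the "spatially constrained" aspect of SRE presumably restricts or reweights the diffusion, but that detail is not given in the abstract.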
