08 May 2022

Graph convolutional networks (GCNs) have achieved impressive performance in learning from graph-structured data. Although GCN and its variants have shown promising results, they remain shallow because their performance drops as the number of layers increases, a problem popularly known as oversmoothing. This work introduces a simple yet effective idea of feature gating over graph convolution layers to facilitate deeper graph neural networks and address oversmoothing. The proposed feature gating is easy to implement without changing the underlying network architecture and is broadly applicable to GCN and almost any of its variants. Further, we demonstrate the use of feature gating to assign importance to node features and to nodes for the node classification task. Quantitative analysis on real-world datasets shows that feature gating paves the way for constructing deeper GCNs.
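
The page does not include code, but the general mechanism the abstract describes can be illustrated with a short sketch. The PyTorch module below is an assumption-laden illustration, not the authors' exact formulation: the names GatedGCNLayer and DeepGatedGCN are made up, the gate is assumed to be a learned per-node, per-feature sigmoid applied before aggregation, and propagation uses a dense normalized adjacency for simplicity.

import torch
import torch.nn as nn


class GatedGCNLayer(nn.Module):
    # One graph convolution layer preceded by an element-wise feature gate.
    # The sigmoid gate rescales each input feature of each node before the
    # usual GCN propagation (normalized adjacency times gated features).
    # NOTE: this gating form is an assumption for illustration only.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.gate = nn.Linear(in_dim, in_dim)    # one gate value per feature
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        # x: (num_nodes, in_dim); adj_norm: symmetrically normalized adjacency
        # with self-loops, shape (num_nodes, num_nodes).
        g = torch.sigmoid(self.gate(x))           # per-node, per-feature gate in (0, 1)
        h = adj_norm @ (g * x)                    # aggregate gated neighbor features
        return torch.relu(self.linear(h))


class DeepGatedGCN(nn.Module):
    # A stack of gated layers; the gating is intended to let the network grow
    # deeper without node representations collapsing (oversmoothing).
    def __init__(self, in_dim, hidden_dim, num_classes, num_layers=8):
        super().__init__()
        self.layers = nn.ModuleList()
        dim = in_dim
        for _ in range(num_layers):
            self.layers.append(GatedGCNLayer(dim, hidden_dim))
            dim = hidden_dim
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, adj_norm):
        for layer in self.layers:
            x = layer(x, adj_norm)
        return self.classifier(x)                 # per-node class logits

Because the gate values g lie in (0, 1) per feature and per node, they can also be read off after training as a rough importance score for features and nodes, in the spirit of the interpretability use mentioned in the abstract; the exact scoring used by the authors may differ.
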
