04 May 2020

Graph Recurrent Neural Networks (GRNNs) are a neural network architecture devised to learn from graph processes, i.e., time sequences of graph signals. Like traditional recurrent neural networks, GRNNs suffer from vanishing/exploding gradients when learning long-term causal dependencies. When these dependencies are independent of the graph, the issue is addressed by adding time gates, as in long short-term memory (LSTM) architectures. In graph processes, however, long-term dependencies are directly influenced by the graph structure and can be stronger or weaker along certain node paths. To address this, we propose two spatial gating strategies for GRNNs that leverage the node and edge structure of the graph. Node- and edge-gated GRNNs are shown to outperform other GRNN architectures on both a synthetic problem and a real-world problem of earthquake epicenter prediction.
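As a rough sketch of the node-gating idea described in the abstract, the snippet below implements a single GRNN recurrence step in which a per-node scalar gate, itself computed with polynomial graph filters, modulates how much each node updates its hidden state. All names, the filter parameterization, and the convex gate/carry combination are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


def graph_filter(S, x, weights):
    """Polynomial graph filter: sum_k (S^k x) W_k.

    S: (N, N) graph shift operator, x: (N, F) graph signal,
    weights: (K, F, G) filter taps. Returns an (N, G) signal.
    """
    K = weights.shape[0]
    out = torch.zeros(x.shape[0], weights.shape[2], dtype=x.dtype)
    z = x
    for k in range(K):
        out = out + z @ weights[k]   # mix features at the current hop
        z = S @ z                    # diffuse the signal one hop further
    return out


class NodeGatedGRNNCell(nn.Module):
    """One recurrence step of a node-gated GRNN (illustrative sketch).

    A scalar gate per node, computed from the input and hidden state
    with graph filters, controls how strongly that node's state is
    overwritten by the candidate update.
    """

    def __init__(self, in_feats, hid_feats, taps=3):
        super().__init__()
        self.A = nn.Parameter(0.1 * torch.randn(taps, in_feats, hid_feats))
        self.B = nn.Parameter(0.1 * torch.randn(taps, hid_feats, hid_feats))
        # gate filters output a single scalar per node
        self.Ag = nn.Parameter(0.1 * torch.randn(taps, in_feats, 1))
        self.Bg = nn.Parameter(0.1 * torch.randn(taps, hid_feats, 1))

    def forward(self, S, x, h):
        gate = torch.sigmoid(graph_filter(S, x, self.Ag)
                             + graph_filter(S, h, self.Bg))    # (N, 1)
        cand = torch.tanh(graph_filter(S, x, self.A)
                          + graph_filter(S, h, self.B))        # (N, H)
        return gate * cand + (1 - gate) * h                    # per-node mix


if __name__ == "__main__":
    N, F, H, T = 20, 3, 8, 5
    S = torch.rand(N, N)
    S = (S + S.T) / 2                       # symmetric toy "graph"
    S = S / torch.linalg.norm(S, 2)         # normalize spectral norm
    cell = NodeGatedGRNNCell(F, H)
    h = torch.zeros(N, H)
    for t in range(T):                      # unroll over the graph process
        h = cell(S, torch.randn(N, F), h)
```

The gate plays the same role as the time gates in an LSTM, but because it is computed with graph filters it varies from node to node, so information can be retained or forgotten differently along different parts of the graph.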
