Graph-Adaptive Activation Functions For Graph Neural Networks
Bianca Iancu, Luana Ruiz, Alejandro Ribeiro, Elvin Isufi
Activation functions are crucial in graph neural networks (GNNs) as they capture the relationship between the input graph data and their representations. We propose activation functions for GNNs that adapt to the graph and are also distributable. To incorporate the feature-topology coupling, nonlinearized nodal features are combined with trainable parameters in a form akin to graph convolutions. This leads to a graph-adaptive, trainable nonlinear component of the GNN that can be implemented directly or via kernel transformations, thus enriching the class of functions available to represent the network data. We show that permutation equivariance is always preserved and prove that the graph-adaptive max nonlinearities are Lipschitz stable to input perturbations. Numerical experiments on source localization, finite-time consensus, distributed regression, and recommender systems confirm our findings and show improved performance compared with pointwise and state-of-the-art localized nonlinearities.
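To make the idea concrete, below is a minimal NumPy sketch of one plausible instance of a graph-adaptive max nonlinearity: per-node maxima are taken over multi-hop neighborhoods (defined by powers of a graph shift operator `S`) and combined with trainable weights, in a form akin to a graph convolution. The function name `graph_adaptive_max`, the shape conventions, and the use of nonzero entries of `S^k` as the k-hop neighborhood mask are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def graph_adaptive_max(x, S, theta):
    """Sketch of a graph-adaptive max nonlinearity (illustrative, not the
    authors' exact definition).

    x     : (n,) nodal feature vector
    S     : (n, n) graph shift operator (e.g., adjacency matrix)
    theta : (K,) trainable weights, one per neighborhood resolution

    For each hop k = 0..K-1, every node takes the max of x over its
    k-hop neighborhood (nonzero pattern of S^k), and the K per-node
    maxima are combined linearly with theta -- a localized, distributable
    operation akin to a graph convolution applied to max-aggregated features.
    """
    n = x.shape[0]
    out = np.zeros(n)
    Sk = np.eye(n)  # S^0 = identity: the 0-hop neighborhood is the node itself
    for k in range(len(theta)):
        mask = Sk != 0  # assumed k-hop neighborhood indicator
        # per-node max over the masked neighborhood (-inf where masked out)
        hop_max = np.where(mask, x[None, :], -np.inf).max(axis=1)
        out += theta[k] * hop_max
        Sk = Sk @ S  # next power of the shift operator
    return out
```

For example, on a 3-node path graph with features `x = [1, 2, 3]` and `theta = [1, 1]`, each node adds its own value to the max over its 1-hop neighbors, yielding `[3, 5, 5]`. Because the aggregation only uses entries of `S^k`, each node needs information from at most (K-1)-hop neighbors, which is what makes the nonlinearity distributable.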