Subgraph Representation Learning With Hard Negative Samples for Inductive Link Prediction
Heeyoung Kwak, Hyunkyung Bae, Kyomin Jung
SPS
Inductive link prediction in knowledge graphs (KGs) is often addressed by inducing logical rules that capture entity-independent relational semantics. Recent studies apply graph representation learning to encode these logical rules within local subgraph structures. With this approach, a model gains the inductive ability to cope with unseen entities, which is practical given the evolving nature of real-world KGs. However, despite the importance of high-quality negative samples in link prediction, there is currently no method for selecting hard negatives for inductive link prediction. To overcome this limitation, we propose a new sampling method that selects hard negative samples for a given positive triplet. We also propose Subgraph Infomax (SGI), a novel inductive link prediction model with a training objective that maximizes the mutual information (MI) between the target relation and the enclosing subgraph. Hard negative samples are selected using the pre-trained MI estimator of SGI, and the model is then fine-tuned on these samples. Empirically, we demonstrate superior performance of our model on multiple datasets of the inductive KGC benchmark, showing enhanced connectivity between the target relation embedding and the subgraph representation.
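The abstract's core idea, using a pre-trained MI estimator to rank corrupted triplets and keep the highest-scoring ones as hard negatives, can be sketched roughly as follows. This is a toy illustration, not the paper's implementation: `mi_score` stands in for SGI's MI estimator (here just a dot product between a relation embedding and a subgraph representation), and all names and vectors are hypothetical.

```python
def mi_score(relation_emb, subgraph_emb):
    # Toy stand-in for the pre-trained MI estimator:
    # a dot product between relation and subgraph embeddings.
    return sum(r * s for r, s in zip(relation_emb, subgraph_emb))

def select_hard_negatives(relation_emb, candidate_subgraphs, k=2):
    # Hard negatives are the corrupted triplets whose enclosing-subgraph
    # representations score HIGHEST under the MI estimator, i.e. the
    # negatives the model currently finds most plausible.
    scored = sorted(candidate_subgraphs.items(),
                    key=lambda kv: mi_score(relation_emb, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:k]]

# Illustrative embeddings (not from the paper).
relation = [1.0, 0.0, 0.5]
negatives = {
    "neg_a": [0.9, 0.1, 0.6],   # similar to the relation -> hard negative
    "neg_b": [-1.0, 0.2, 0.0],  # dissimilar -> easy negative
    "neg_c": [0.7, -0.1, 0.4],  # moderately similar -> hard negative
}
print(select_hard_negatives(relation, negatives))  # -> ['neg_a', 'neg_c']
```

In the paper's pipeline, these selected hard negatives would then be used to fine-tune the model, rather than sampling negatives uniformly at random.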