Published March 16, 2023 | Version v1
Conference paper (Open Access)

Graph2Feat: Inductive Link Prediction via Knowledge Distillation

  • 1. KTH Royal Institute of Technology

Description

Link prediction between two nodes is a critical task in graph machine learning. Most approaches are based on variants of graph neural networks (GNNs) that focus on transductive link prediction and have high inference latency. However, many real-world applications require fast inference over new nodes in inductive settings, where no connectivity information is available for these nodes; node features therefore provide an essential alternative. To that end, we propose Graph2Feat, which enables inductive link prediction by exploiting knowledge distillation (KD) through the student-teacher learning framework. In particular, Graph2Feat learns to match the representations of a lightweight student multi-layer perceptron (MLP) with those of a more expressive teacher GNN while learning to predict missing links from node features, thus attaining both the GNN's expressiveness and the MLP's fast inference. Furthermore, our approach is general: it is suitable for transductive and inductive link prediction on different types of graphs, whether homogeneous or heterogeneous, directed or undirected. We carry out extensive experiments on seven real-world datasets, including homogeneous and heterogeneous graphs. Our experiments demonstrate that Graph2Feat significantly outperforms state-of-the-art (SOTA) methods in terms of AUC and average precision, with improvements of 7.2% and 4.8% on homogeneous graphs and 44.7% and 48.0% on heterogeneous graphs, respectively. Finally, Graph2Feat has the lowest inference time among the SOTA methods, with a 100x speedup compared to GNNs. The code and datasets are available on GitHub.
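To make the student-teacher idea concrete, the following is a minimal sketch of the kind of combined objective the abstract describes: a lightweight MLP maps raw node features to embeddings, a representation-matching (distillation) term pulls those embeddings toward a teacher GNN's embeddings, and a link-prediction term scores edges from embedding dot products. All dimensions, the loss weighting `alpha`, and the dot-product scorer are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only, not from the paper)
n_nodes, feat_dim, hidden_dim, emb_dim = 6, 8, 16, 4

X = rng.normal(size=(n_nodes, feat_dim))           # raw node features
teacher_emb = rng.normal(size=(n_nodes, emb_dim))  # stand-in for teacher GNN output

# Lightweight student MLP: node features -> embedding (uses no graph structure,
# so it can embed brand-new nodes at inference time)
W1 = rng.normal(size=(feat_dim, hidden_dim)) * 0.1
W2 = rng.normal(size=(hidden_dim, emb_dim)) * 0.1

def student(X):
    h = np.maximum(X @ W1, 0.0)  # ReLU hidden layer
    return h @ W2                # student node embeddings

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

emb = student(X)

# Link-prediction loss: binary cross-entropy on dot-product edge scores
pos = [(0, 1), (2, 3)]  # pretend observed edges
neg = [(0, 4), (1, 5)]  # sampled non-edges
scores = np.array([emb[u] @ emb[v] for u, v in pos + neg])
labels = np.array([1.0, 1.0, 0.0, 0.0])
p = sigmoid(scores)
bce = -np.mean(labels * np.log(p + 1e-9) + (1 - labels) * np.log(1 - p + 1e-9))

# Distillation loss: match the student's embeddings to the teacher's
kd = np.mean((emb - teacher_emb) ** 2)

alpha = 0.5  # hypothetical weight balancing the two terms
total_loss = bce + alpha * kd
print(float(total_loss))
```

At inference time only the `student` forward pass is needed, which is why an MLP distilled this way can be orders of magnitude faster than running a GNN over the graph.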

Files (598.5 kB)

Graph2Feat__Inductive_Link_Prediction_via_Knowledge_Distillation.pdf

Additional details

Funding

European Commission
RAIS: Real-time Analytics for the Internet of Sports (grant 813162)