Distribution Free Prediction Sets for Node Classification
Abstract
Graph Neural Networks (GNNs) achieve high classification accuracy on many important real-world datasets, but provide no rigorous notion of predictive uncertainty. Quantifying the confidence of GNN models is difficult due to the dependence between datapoints induced by the graph structure. We leverage recent advances in conformal prediction to construct prediction sets for node classification in inductive learning scenarios. We do this by taking an existing approach for conformal classification that relies on exchangeable data and modifying it by appropriately weighting the conformal scores to reflect the network structure. We show through experiments on standard benchmark datasets using popular GNN models that our approach provides tighter and better-calibrated prediction sets than a naive application of conformal prediction.
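The core idea the abstract describes, i.e. reweighting calibration scores before taking the conformal quantile, can be sketched roughly as follows. This is an illustrative sketch of weighted split conformal classification, not the paper's exact procedure: the function name, the choice of nonconformity score (one minus the softmax probability of the true class), and the `weights` argument (standing in for weights derived from the network structure) are all assumptions for the sake of the example.

```python
import numpy as np

def weighted_conformal_set(cal_probs, cal_labels, weights, test_probs, alpha=0.1):
    """Hypothetical weighted split conformal classifier.

    cal_probs:  (n, K) softmax outputs on calibration nodes
    cal_labels: (n,) true labels of calibration nodes
    weights:    (n,) nonnegative weights (e.g. reflecting graph proximity)
    test_probs: (m, K) softmax outputs on test nodes
    alpha:      target miscoverage level
    """
    n = len(cal_labels)
    # Nonconformity score: 1 - probability assigned to the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Sort scores and form the weighted empirical CDF; the extra +1 in the
    # denominator plays the role of the test point's own weight.
    order = np.argsort(scores)
    sorted_scores = scores[order]
    cum_w = np.cumsum(weights[order]) / (weights.sum() + 1.0)
    # Weighted (1 - alpha) quantile of the calibration scores.
    idx = np.searchsorted(cum_w, 1.0 - alpha)
    qhat = sorted_scores[min(idx, n - 1)]
    # Prediction set: every class whose score falls below the threshold.
    return [np.where(1.0 - p <= qhat)[0] for p in test_probs]
```

With uniform weights this reduces to standard split conformal classification; the paper's contribution is choosing non-uniform weights so that coverage holds despite the graph-induced dependence between nodes.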