Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


DiffWire: Inductive Graph Rewiring via the Lovász Bound

Adrian Arnaiz-Rodriguez, Ahmed Begga, Francisco Escolano, Nuria Oliver

Published: 2022-06-15 · Tasks: Graph Classification, Node Classification
Links: Paper · PDF · Code (official)

Abstract

Graph Neural Networks (GNNs) have been shown to achieve competitive results on graph-related tasks, such as node and graph classification, link prediction, and node and graph clustering in a variety of domains. Most GNNs use a message passing framework and hence are called MPNNs. Despite their promising results, MPNNs have been reported to suffer from over-smoothing, over-squashing and under-reaching. Graph rewiring and graph pooling have been proposed in the literature as solutions to address these limitations. However, most state-of-the-art graph rewiring methods fail to preserve the global topology of the graph, are neither differentiable nor inductive, and require the tuning of hyper-parameters. In this paper, we propose DiffWire, a novel framework for graph rewiring in MPNNs that is principled, fully differentiable and parameter-free by leveraging the Lovász bound. The proposed approach provides a unified theory for graph rewiring by proposing two new, complementary layers in MPNNs: CT-Layer, a layer that learns the commute times and uses them as a relevance function for edge re-weighting; and GAP-Layer, a layer to optimize the spectral gap, depending on the nature of the network and the task at hand. We empirically validate the value of each of these layers separately with benchmark datasets for graph classification. We also perform preliminary studies on the use of CT-Layer for homophilic and heterophilic node classification tasks. DiffWire brings together the learnability of commute times with related definitions of curvature, opening the door to creating more expressive MPNNs.
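The two quantities the abstract mentions have classical closed forms that may help fix intuition: the commute time between two nodes is the volume of the graph times their effective resistance, computable from the pseudoinverse of the graph Laplacian, and the spectral gap is the second-smallest eigenvalue of the normalized Laplacian. CT-Layer and GAP-Layer respectively *learn* or *optimize* these quantities inside an MPNN; the sketch below only computes the exact classical values with NumPy, as a reference point, and is not the authors' implementation.

```python
import numpy as np

def commute_times(A):
    """Exact commute times from the Laplacian pseudoinverse.

    C_uv = vol(G) * (L+_uu + L+_vv - 2 L+_uv), where vol(G) is the sum
    of degrees. CT-Layer learns an approximation of this quantity for
    edge re-weighting; this closed form is the classical target.
    """
    deg = A.sum(axis=1)
    L = np.diag(deg) - A                  # combinatorial Laplacian
    Lp = np.linalg.pinv(L)                # Moore-Penrose pseudoinverse
    d = np.diag(Lp)
    R = d[:, None] + d[None, :] - 2 * Lp  # effective resistances
    return deg.sum() * R

def spectral_gap(A):
    """Second-smallest eigenvalue of the normalized Laplacian,
    the quantity GAP-Layer targets via rewiring."""
    deg = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    Ln = np.eye(len(A)) - D_inv_sqrt @ A @ D_inv_sqrt
    return np.sort(np.linalg.eigvalsh(Ln))[1]

# Example: path graph on 4 nodes (vol(G) = 6).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
# In a tree the effective resistance equals the path length, so the
# commute time between the endpoints is 6 * 3 = 18.
```

The pseudoinverse makes this O(n³), which is why a learned, differentiable approximation (as in CT-Layer) is attractive for inductive settings.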

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Graph Classification | IMDB-BINARY | Accuracy | 69.93 | GAP-Layer (Rcut) |
| Graph Classification | IMDB-BINARY | Accuracy | 69.84 | CT-Layer |
| Graph Classification | IMDB-BINARY | Accuracy | 68.8 | GAP-Layer (Ncut) |
| Graph Classification | REDDIT-BINARY | Accuracy | 78.45 | CT-Layer |
| Graph Classification | REDDIT-BINARY | Accuracy | 77.63 | GAP-Layer (Rcut) |
| Graph Classification | REDDIT-BINARY | Accuracy | 77.17 | DiffWire |
| Graph Classification | REDDIT-BINARY | Accuracy | 76 | GAP-Layer (Ncut) |
| Node Classification | Wisconsin | Accuracy | 79.05 | CT-Layer |
| Node Classification | Wisconsin | Accuracy | 69.25 | CT-Layer (PE) |
| Node Classification | Cornell | Accuracy | 69.04 | CT-Layer |
| Node Classification | Cornell | Accuracy | 58.02 | CT-Layer (PE) |
| Node Classification | Citeseer | Accuracy | 72.26 | CT-Layer (PE) |
| Node Classification | Citeseer | Accuracy | 66.71 | CT-Layer |
| Node Classification | Pubmed | Accuracy | 86.07 | CT-Layer (PE) |
| Node Classification | Pubmed | Accuracy | 68.19 | CT-Layer |
| Node Classification | Actor | Accuracy | 31.98 | CT-Layer |
| Node Classification | Actor | Accuracy | 29.35 | CT-Layer (PE) |

Related Papers

- Demystifying Distributed Training of Graph Neural Networks for Link Prediction (2025-06-25)
- Equivariance Everywhere All At Once: A Recipe for Graph Foundation Models (2025-06-17)
- Density-aware Walks for Coordinated Campaign Detection (2025-06-16)
- Delving into Instance-Dependent Label Noise in Graph Data: A Comprehensive Study and Benchmark (2025-06-14)
- Graph Semi-Supervised Learning for Point Classification on Data Manifolds (2025-06-13)
- Devil's Hand: Data Poisoning Attacks to Locally Private Graph Learning Protocols (2025-06-11)
- Wasserstein Hypergraph Neural Network (2025-06-11)
- Positional Encoding meets Persistent Homology on Graphs (2025-06-06)