Papers With Code 2

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

MPNN

Message Passing Neural Network

Graphs · Introduced 2000 · 74 papers
Source Paper

Description

There are at least eight notable examples of models from the literature that can be described using the Message Passing Neural Networks (MPNN) framework. For simplicity we describe MPNNs which operate on undirected graphs $G$ with node features $x_v$ and edge features $e_{vw}$. It is trivial to extend the formalism to directed multigraphs. The forward pass has two phases, a message passing phase and a readout phase. The message passing phase runs for $T$ time steps and is defined in terms of message functions $M_t$ and vertex update functions $U_t$. During the message passing phase, hidden states $h_v^t$ at each node in the graph are updated based on messages $m_v^{t+1}$ according to

$$m_v^{t+1} = \sum_{w \in N(v)} M_t\left(h_v^t, h_w^t, e_{vw}\right)$$

$$h_v^{t+1} = U_t\left(h_v^t, m_v^{t+1}\right)$$

where in the sum, $N(v)$ denotes the neighbors of $v$ in graph $G$. The readout phase computes a feature vector for the whole graph using some readout function $R$ according to

$$\hat{y} = R\left(\left\{ h_v^T \mid v \in G \right\}\right)$$

The message functions $M_t$, vertex update functions $U_t$, and readout function $R$ are all learned differentiable functions. $R$ operates on the set of node states and must be invariant to permutations of the node states in order for the MPNN to be invariant to graph isomorphism.
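The two phases above can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not the code of any particular paper: it assumes $M_t$ and $U_t$ are single linear layers with a ReLU nonlinearity, and $R$ is a permutation-invariant sum followed by a linear map. All function and variable names (`mpnn_forward`, `W_msg`, `W_upd`, `W_out`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def mpnn_forward(h, edges, e_feat, T, W_msg, W_upd, W_out):
    """One MPNN forward pass.

    h      : (n, d) initial node states
    edges  : list of undirected pairs (v, w)
    e_feat : dict mapping (v, w) -> edge feature vector of size de
    """
    for t in range(T):
        m = np.zeros_like(h)  # messages m_v^{t+1}, accumulated per node
        for v, w in edges:
            # sum over neighbors: M_t(h_v, h_w, e_vw); the graph is
            # undirected, so each edge sends a message both ways
            m[v] += relu(np.concatenate([h[v], h[w], e_feat[(v, w)]]) @ W_msg[t])
            m[w] += relu(np.concatenate([h[w], h[v], e_feat[(v, w)]]) @ W_msg[t])
        # vertex update: h_v^{t+1} = U_t(h_v^t, m_v^{t+1})
        h = relu(np.concatenate([h, m], axis=1) @ W_upd[t])
    # readout R: a sum over node states is permutation-invariant
    return np.sum(h, axis=0) @ W_out

# Toy graph: a triangle, 4-dim node features, 2-dim edge features, T = 2
n, d, de, T = 3, 4, 2, 2
h0 = rng.standard_normal((n, d))
edges = [(0, 1), (1, 2), (0, 2)]
e_feat = {e: rng.standard_normal(de) for e in edges}
W_msg = [rng.standard_normal((2 * d + de, d)) for _ in range(T)]
W_upd = [rng.standard_normal((2 * d, d)) for _ in range(T)]
W_out = rng.standard_normal((d, 1))

y_hat = mpnn_forward(h0, edges, e_feat, T, W_msg, W_upd, W_out)
print(y_hat.shape)
```

Because the readout sums over node states, relabeling the nodes (and remapping the edges accordingly) leaves $\hat{y}$ unchanged, which is the permutation-invariance requirement stated above.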

Papers Using This Method

- Understanding Generalization in Node and Link Prediction (2025-07-01)
- Exploring Graph-Transformer Out-of-Distribution Generalization Abilities (2025-06-25)
- Evaluating Loss Functions for Graph Neural Networks: Towards Pretraining and Generalization (2025-06-17)
- WILTing Trees: Interpreting the Distance Between MPNN Embeddings (2025-05-30)
- Improving the Effective Receptive Field of Message-Passing Neural Networks (2025-05-29)
- Uncertainty Estimation for Heterophilic Graphs Through the Lens of Information Theory (2025-05-28)
- Open the Eyes of MPNN: Vision Enhances MPNN in Link Prediction (2025-05-13)
- Efficient Parallelization of Message Passing Neural Network Potentials for Large-scale Molecular Dynamics (2025-05-10)
- A Materials Map Integrating Experimental and Computational Data via Graph-Based Machine Learning for Enhanced Materials Discovery (2025-03-10)
- Towards graph neural networks for provably solving convex optimization problems (2025-02-04)
- A Message Passing Neural Network Surrogate Model for Bond-Associated Peridynamic Material Correspondence Formulation (2024-10-29)
- Causal GNNs: A GNN-Driven Instrumental Variable Approach for Causal Inference in Networks (2024-09-13)
- MCU-MixQ: A HW/SW Co-optimized Mixed-precision Neural Network Design Framework for MCUs (2024-07-17)
- Commute Graph Neural Networks (2024-06-30)
- Next Level Message-Passing with Hierarchical Support Graphs (2024-06-22)
- Beyond 5G Network Failure Classification for Network Digital Twin Using Graph Neural Network (2024-06-06)
- Graph Neural Networks Approach for Joint Wireless Power Control and Spectrum Allocation (2024-06-03)
- Bundle Neural Networks for message diffusion on graphs (2024-05-24)
- Scene Graph Generation Strategy with Co-occurrence Knowledge and Learnable Term Frequency (2024-05-21)
- Could Chemical LLMs benefit from Message Passing (2024-05-14)