Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs

Yi-Lun Liao, Tess Smidt

2022-06-23 · Molecular Property Prediction · Graph Property Prediction · Graph Attention · Initial Structure to Relaxed Energy (IS2RE), Direct

Paper · PDF · Code (official)

Abstract

Despite their widespread success in various domains, Transformer networks have yet to perform well across datasets in the domain of 3D atomistic graphs such as molecules, even when 3D-related inductive biases like translational invariance and rotational equivariance are considered. In this paper, we demonstrate that Transformers can generalize well to 3D atomistic graphs and present Equiformer, a graph neural network leveraging the strength of Transformer architectures and incorporating SE(3)/E(3)-equivariant features based on irreducible representations (irreps). First, we propose a simple and effective architecture that only replaces the original operations in Transformers with their equivariant counterparts and includes tensor products. Using equivariant operations enables encoding equivariant information in channels of irreps features without complicating graph structures. With minimal modifications to Transformers, this architecture already achieves strong empirical results. Second, we propose a novel attention mechanism called equivariant graph attention, which improves upon typical attention in Transformers by replacing dot product attention with multi-layer perceptron attention and including non-linear message passing. With these two innovations, Equiformer achieves results competitive with previous models on the QM9, MD17 and OC20 datasets.
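To make the second innovation concrete, here is a minimal sketch of MLP attention over a graph: instead of a dot-product score q·k, each edge score comes from a small MLP applied to the concatenated source and destination features. This toy version uses plain scalar features in NumPy and omits everything that makes Equiformer equivariant (irreps features, tensor products, equivariant non-linearities); all function and variable names are illustrative, not from the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_attention_messages(x, edges, W1, w2, Wv):
    """Toy MLP attention over a directed graph (scalar features only).

    Each edge (src, dst) gets a score from a 2-layer MLP on the
    concatenated [x[src], x[dst]] features; scores are softmax-normalized
    over the incoming edges of each node and used to weight messages.
    """
    n, d = x.shape
    out = np.zeros_like(x)
    for dst in range(n):
        srcs = [s for s, t in edges if t == dst]
        if not srcs:
            continue  # nodes with no incoming edges receive a zero update
        feats = np.stack([np.concatenate([x[s], x[dst]]) for s in srcs])
        hidden = np.maximum(feats @ W1, 0.0)   # MLP hidden layer (ReLU)
        scores = hidden @ w2                   # one scalar score per edge
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()                   # softmax over incoming edges
        values = x[srcs] @ Wv                  # per-neighbor messages
        out[dst] = alpha @ values              # attention-weighted aggregation
    return out

d = 4
x = rng.standard_normal((5, d))
edges = [(0, 1), (2, 1), (3, 1), (1, 0), (4, 0)]
W1 = rng.standard_normal((2 * d, 8))
w2 = rng.standard_normal(8)
Wv = rng.standard_normal((d, d))
y = mlp_attention_messages(x, edges, W1, w2, Wv)
print(y.shape)  # (5, 4)
```

Because the score is an arbitrary learned function of both endpoint features rather than an inner product, it can express non-linear interactions between query and key; the paper pairs this with non-linear message passing, which the `values` line above simplifies to a single linear map.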

Results

Task                       | Dataset | Metric           | Value | Model
Graph Property Prediction  | QM9     | Standardized MAE | 0.7   | Equiformer
Graph Property Prediction  | QM9     | alpha (ma)       | 46    | Equiformer
Graph Property Prediction  | QM9     | gap (meV)        | 30    | Equiformer
Graph Property Prediction  | QM9     | logMAE           | -5.82 | Equiformer

Related Papers

Catching Bid-rigging Cartels with Graph Attention Neural Networks (2025-07-16)
Wavelet-Enhanced Neural ODE and Graph Attention for Interpretable Energy Forecasting (2025-07-14)
Acquiring and Adapting Priors for Novel Tasks via Neural Meta-Architectures (2025-07-07)
Combining Graph Neural Networks and Mixed Integer Linear Programming for Molecular Inference under the Two-Layered Model (2025-07-05)
Following the Clues: Experiments on Person Re-ID using Cross-Modal Intelligence (2025-07-02)
TRIDENT: Tri-Modal Molecular Representation Learning with Taxonomic Annotations and Local Correspondence (2025-06-26)
Temporal-Aware Graph Attention Network for Cryptocurrency Transaction Fraud Detection (2025-06-26)
Descriptor-based Foundation Models for Molecular Property Prediction (2025-06-18)