Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


EquiformerV2: Improved Equivariant Transformer for Scaling to Higher-Degree Representations

Yi-Lun Liao, Brandon Wood, Abhishek Das, Tess Smidt

2023-06-21 · Graph Property Prediction

Abstract

Equivariant Transformers such as Equiformer have demonstrated the efficacy of applying Transformers to the domain of 3D atomistic systems. However, they are limited to small degrees of equivariant representations due to their computational complexity. In this paper, we investigate whether these architectures can scale well to higher degrees. Starting from Equiformer, we first replace $SO(3)$ convolutions with eSCN convolutions to efficiently incorporate higher-degree tensors. Then, to better leverage the power of higher degrees, we propose three architectural improvements -- attention re-normalization, separable $S^2$ activation, and separable layer normalization. Putting this all together, we propose EquiformerV2, which outperforms previous state-of-the-art methods on the large-scale OC20 dataset by up to $9\%$ on forces and $4\%$ on energies, offers better speed-accuracy trade-offs, and reduces the number of DFT calculations needed for computing adsorption energies by $2\times$. Additionally, EquiformerV2 trained on only the OC22 dataset outperforms GemNet-OC trained on both OC20 and OC22, achieving much better data efficiency. Finally, we compare EquiformerV2 with Equiformer on the QM9 and OC20 S2EF-2M datasets to better understand the performance gain brought by higher degrees.
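The first of the three improvements, attention re-normalization, inserts an additional layer normalization into the attention computation before the softmax over neighbors. The exact placement and the scoring step below are my assumptions for illustration, not the paper's implementation; a minimal NumPy sketch:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Normalize over the feature (last) axis."""
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def attention_weights_renormalized(f_ij):
    """Hypothetical attention re-normalization: layer-normalize per-edge
    attention features (the extra LN), score them, softmax over neighbors.

    f_ij: array of shape (num_neighbors, feature_dim).
    Returns one weight per neighbor, summing to 1.
    """
    h = layer_norm(f_ij)                       # extra LN: the "re-normalization"
    h = np.maximum(0.2 * h, h)                 # leaky ReLU nonlinearity
    scores = h.sum(axis=-1)                    # stand-in for a learned linear scoring layer
    e = np.exp(scores - scores.max())          # numerically stable softmax
    return e / e.sum()
```

The intuition is that normalizing the attention features keeps their scale stable as the maximum degree (and thus feature dimensionality) grows, so the softmax does not saturate.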

Results

| Task | Dataset | Metric | Value | Model |
| --- | --- | --- | --- | --- |
| Graph Property Prediction | QM9 | Standardized MAE | 0.67 | EquiformerV2 |
| Graph Property Prediction | QM9 | alpha (ma) | 47 | EquiformerV2 |
| Graph Property Prediction | QM9 | gap (meV) | 29 | EquiformerV2 |
| Graph Property Prediction | QM9 | logMAE | -5.87 | EquiformerV2 |
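The aggregate QM9 metrics in the table (Standardized MAE, logMAE) summarize per-target errors across the twelve regression targets. How the leaderboard computes them is not stated here; a plausible sketch, assuming Standardized MAE is each target's MAE divided by that target's standard deviation and then averaged, and logMAE is the mean of log MAEs:

```python
import math

def standardized_mae(maes, target_stds):
    """Average of per-target MAEs, each scaled by the target's std
    (assumed convention; units cancel, so targets are comparable)."""
    return sum(m / s for m, s in zip(maes, target_stds)) / len(maes)

def log_mae(maes):
    """Mean of natural-log MAEs across targets (assumed convention)."""
    return sum(math.log(m) for m in maes) / len(maes)
```

Because logMAE averages logarithms, it rewards uniform improvement across targets rather than large gains on a single easy one.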

Related Papers

- Graph Positional Autoencoders as Self-supervised Learners (2025-05-29)
- Message-Passing State-Space Models: Improving Graph Learning with Modern Sequence Modeling (2025-05-24)
- GotenNet: Rethinking Efficient 3D Equivariant Graph Neural Networks (2025-04-24)
- Unlocking the Potential of Classic GNNs for Graph-level Tasks: Simple Architectures Meet Excellence (2025-02-13)
- Graph Generative Pre-trained Transformer (2025-01-02)
- Virtual Nodes Can Help: Tackling Distribution Shifts in Federated Graph Learning (2024-12-26)
- Data-Driven Self-Supervised Graph Representation Learning (2024-12-24)
- Next Level Message-Passing with Hierarchical Support Graphs (2024-06-22)