Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Earthformer: Exploring Space-Time Transformers for Earth System Forecasting

Zhihan Gao, Xingjian Shi, Hao Wang, Yi Zhu, Yuyang Wang, Mu Li, Dit-yan Yeung

2022-07-12 · Weather Forecasting · Earth Surface Forecasting
Paper · PDF · Code (official)

Abstract

Conventionally, Earth system (e.g., weather and climate) forecasting relies on numerical simulation with complex physical models and is hence both computationally expensive and demanding of domain expertise. With the explosive growth of spatiotemporal Earth observation data in the past decade, data-driven models that apply Deep Learning (DL) are demonstrating impressive potential for various Earth system forecasting tasks. Despite its broad success in other domains, the Transformer, an emerging DL architecture, has seen limited adoption in this area. In this paper, we propose Earthformer, a space-time Transformer for Earth system forecasting. Earthformer is based on a generic, flexible, and efficient space-time attention block named Cuboid Attention. The idea is to decompose the data into cuboids and apply cuboid-level self-attention in parallel. These cuboids are further connected with a collection of global vectors. We conduct experiments on the MovingMNIST dataset and a newly proposed chaotic N-body MNIST dataset to verify the effectiveness of cuboid attention and to determine the best design of Earthformer. Experiments on two real-world benchmarks, precipitation nowcasting and El Niño/Southern Oscillation (ENSO) forecasting, show that Earthformer achieves state-of-the-art performance. Code is available: https://github.com/amazon-science/earth-forecasting-transformer .
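The core idea of the abstract — decompose a space-time tensor into cuboids and run self-attention inside each cuboid in parallel — can be sketched in a toy form. This is a simplified illustration, not the paper's implementation: it uses identity Q/K/V projections and omits the learned projections, shifted cuboid patterns, and global vectors that the actual Cuboid Attention block includes.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cuboid_self_attention(x, cuboid=(2, 4, 4)):
    """Toy cuboid attention: split a (T, H, W, C) tensor into
    non-overlapping cuboids and apply plain self-attention
    independently within each cuboid.

    Simplifications relative to the paper: Q = K = V = x (no learned
    projections), no cuboid shifting, no global vectors.
    """
    T, H, W, C = x.shape
    t, h, w = cuboid
    assert T % t == 0 and H % h == 0 and W % w == 0, "cuboid must tile the input"
    # (T, H, W, C) -> (nT, t, nH, h, nW, w, C) -> (num_cuboids, t*h*w, C)
    x6 = x.reshape(T // t, t, H // h, h, W // w, w, C)
    x6 = x6.transpose(0, 2, 4, 1, 3, 5, 6)
    seq = x6.reshape(-1, t * h * w, C)
    # scaled dot-product attention, batched over all cuboids in parallel
    attn = softmax(seq @ seq.transpose(0, 2, 1) / np.sqrt(C))
    out = attn @ seq
    # invert the decomposition back to (T, H, W, C)
    out = out.reshape(T // t, H // h, W // w, t, h, w, C)
    out = out.transpose(0, 3, 1, 4, 2, 5, 6).reshape(T, H, W, C)
    return out

# Example: a short video-like tensor of 4 frames, 8x8 spatial grid, 3 channels
x = np.random.default_rng(0).normal(size=(4, 8, 8, 3))
y = cuboid_self_attention(x)  # same shape as the input
```

Because attention cost is quadratic in sequence length, restricting it to cuboids of `t*h*w` tokens (here 32) instead of the full `T*H*W` tokens (here 256) is what makes the block efficient; the paper recovers cross-cuboid interaction through the global vectors omitted above.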

Results

Task                | Dataset                | Metric        | Value  | Model
Video Prediction    | EarthNet2021 IID Track | EarthNetScore | 0.3425 | Earthformer
Video Prediction    | EarthNet2021 OOD Track | EarthNetScore | 0.3252 | Earthformer
Weather Forecasting | SEVIR                  | MSE           | 3.6957 | Earthformer
Weather Forecasting | SEVIR                  | mCSI          | 0.4419 | Earthformer
Weather Forecasting | SEVIR                  | MSE           | 3.7532 | ConvLSTM
Weather Forecasting | SEVIR                  | mCSI          | 0.4185 | ConvLSTM

Related Papers

- FourCastNet 3: A geometric approach to probabilistic machine-learning weather forecasting at scale (2025-07-16)
- D3FL: Data Distribution and Detrending for Robust Federated Learning in Non-linear Time-series Data (2025-07-15)
- Optimising 4th-Order Runge-Kutta Methods: A Dynamic Heuristic Approach for Efficiency and Low Storage (2025-06-26)
- Distributed Cross-Channel Hierarchical Aggregation for Foundation Models (2025-06-26)
- Elucidated Rolling Diffusion Models for Probabilistic Weather Forecasting (2025-06-24)
- Skillful joint probabilistic weather forecasting from marginals (2025-06-12)
- A multi-scale loss formulation for learning a probabilistic model with proper score optimisation (2025-06-12)
- AtmosMJ: Revisiting Gating Mechanism for AI Weather Forecasting Beyond the Year Scale (2025-06-11)