Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Spatial-Temporal-Decoupled Masked Pre-training for Spatiotemporal Forecasting

Haotian Gao, Renhe Jiang, Zheng Dong, Jinliang Deng, Yuxin Ma, Xuan Song

2023-12-01 · Traffic Prediction · Time Series
Paper · PDF · Code (official)

Abstract

Spatiotemporal forecasting techniques are significant for various domains such as transportation, energy, and weather. Accurate prediction of spatiotemporal series remains challenging due to complex spatiotemporal heterogeneity. In particular, current end-to-end models are limited by input length and thus often fall into the spatiotemporal mirage, i.e., similar input time series followed by dissimilar future values, and vice versa. To address these problems, we propose a novel self-supervised pre-training framework, Spatial-Temporal-Decoupled Masked Pre-training (STD-MAE), that employs two decoupled masked autoencoders to reconstruct spatiotemporal series along the spatial and temporal dimensions. Rich-context representations learned through such reconstruction can be seamlessly integrated by downstream predictors with arbitrary architectures to augment their performance. A series of quantitative and qualitative evaluations on six widely used benchmarks (PEMS03, PEMS04, PEMS07, PEMS08, METR-LA, and PEMS-BAY) is conducted to validate the state-of-the-art performance of STD-MAE. Code is available at https://github.com/Jimmy-7664/STD-MAE.
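The core idea of the decoupled pre-training, as described in the abstract, is that one masked autoencoder reconstructs the series with whole time steps masked out, while the other reconstructs it with whole sensors (spatial locations) masked out. The sketch below illustrates only the two masking strategies on a toy series of shape (time, sensors); the function name `mask_series` and the 75% mask ratio are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_series(x, axis, mask_ratio=0.75, rng=rng):
    """Zero out a random subset of slices along `axis` (hypothetical helper).

    Returns the masked copy of `x` and the boolean mask marking
    which indices along `axis` were hidden from the encoder.
    """
    n = x.shape[axis]
    idx = rng.choice(n, size=int(n * mask_ratio), replace=False)
    mask = np.zeros(n, dtype=bool)
    mask[idx] = True
    x_masked = x.copy()
    # Build an indexer that selects the masked slices along `axis`.
    indexer = [slice(None)] * x.ndim
    indexer[axis] = mask
    x_masked[tuple(indexer)] = 0.0
    return x_masked, mask

# Toy spatiotemporal series: T=12 time steps, N=4 sensors.
x = rng.normal(size=(12, 4))

# Temporal autoencoder input: whole time steps masked (axis 0).
x_temporal, t_mask = mask_series(x, axis=0)

# Spatial autoencoder input: whole sensors masked (axis 1).
x_spatial, s_mask = mask_series(x, axis=1)
```

In the full framework each autoencoder would be trained to reconstruct the masked slices from the visible ones, and the resulting encoders supply representations to a downstream predictor.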

Results

| Task | Dataset | Metric | Value | Model |
| --- | --- | --- | --- | --- |
| Traffic Prediction | PeMSD7(M) | 12 steps MAE | 2.52 | STD-MAE |
| Traffic Prediction | PeMSD7(M) | 12 steps MAPE | 6.35 | STD-MAE |
| Traffic Prediction | PeMSD7(M) | 12 steps RMSE | 5.2 | STD-MAE |
| Traffic Prediction | EXPY-TKY | 1 step MAE | 5.73 | STD-MAE |
| Traffic Prediction | EXPY-TKY | 3 step MAE | 6.41 | STD-MAE |
| Traffic Prediction | EXPY-TKY | 6 step MAE | 6.75 | STD-MAE |
| Traffic Prediction | PeMS07 | MAE@1h | 18.31 | STD-MAE |
| Traffic Prediction | PeMSD4 | 12 steps MAE | 17.8 | STD-MAE |
| Traffic Prediction | PeMSD7(L) | 12 steps MAE | 2.64 | STD-MAE |
| Traffic Prediction | PeMSD7(L) | 12 steps MAPE | 6.65 | STD-MAE |
| Traffic Prediction | PeMSD7(L) | 12 steps RMSE | 5.5 | STD-MAE |
| Traffic Prediction | PeMSD8 | 12 steps MAE | 13.44 | STD-MAE |
| Traffic Prediction | PeMSD8 | 12 steps MAPE | 8.76 | STD-MAE |
| Traffic Prediction | PeMSD8 | 12 steps RMSE | 22.47 | STD-MAE |
| Traffic Prediction | PeMSD7 | 12 steps MAE | 18.31 | STD-MAE |
| Traffic Prediction | PeMSD7 | 12 steps MAPE | 7.72 | STD-MAE |
| Traffic Prediction | PeMSD7 | 12 steps RMSE | 31.07 | STD-MAE |
| Traffic Prediction | PEMS-BAY | MAE @ 12 step | 1.77 | STD-MAE |
| Traffic Prediction | PEMS-BAY | RMSE | 4.2 | STD-MAE |
| Traffic Prediction | METR-LA | 12 steps MAE | 3.4 | STD-MAE |
| Traffic Prediction | METR-LA | 12 steps MAPE | 9.59 | STD-MAE |
| Traffic Prediction | METR-LA | 12 steps RMSE | 7.07 | STD-MAE |
| Traffic Prediction | METR-LA | MAE @ 12 step | 3.4 | STD-MAE |
| Traffic Prediction | METR-LA | MAE @ 3 step | 2.62 | STD-MAE |
| Traffic Prediction | PeMSD3 | 12 steps MAE | 13.8 | STD-MAE |
| Traffic Prediction | PeMSD3 | 12 steps MAPE | 13.96 | STD-MAE |
| Traffic Prediction | PeMSD3 | 12 steps RMSE | 24.43 | STD-MAE |
| Traffic Prediction | PeMS04 | 12 Steps MAE | 17.8 | STD-MAE |

Related Papers

MoTM: Towards a Foundation Model for Time Series Imputation based on Continuous Modeling (2025-07-17)
The Power of Architecture: Deep Dive into Transformer Architectures for Long-Term Time Series Forecasting (2025-07-17)
Data Augmentation in Time Series Forecasting through Inverted Framework (2025-07-15)
D3FL: Data Distribution and Detrending for Robust Federated Learning in Non-linear Time-series Data (2025-07-15)
Federated Learning with Graph-Based Aggregation for Traffic Forecasting (2025-07-13)
Towards Interpretable Time Series Foundation Models (2025-07-10)
MoFE-Time: Mixture of Frequency Domain Experts for Time-Series Forecasting Models (2025-07-09)
Foundation models for time series forecasting: Application in conformal prediction (2025-07-09)