Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Unified Training of Universal Time Series Forecasting Transformers

Gerald Woo, Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, Doyen Sahoo

Published: 2024-02-04 · Tasks: Time Series Forecasting, Time Series
Links: Paper · PDF · Code (official)

Abstract

Deep learning for time series forecasting has traditionally operated within a one-model-per-dataset framework, limiting its potential to leverage the game-changing impact of large pre-trained models. The concept of universal forecasting, emerging from pre-training on a vast collection of time series datasets, envisions a single Large Time Series Model capable of addressing diverse downstream forecasting tasks. However, constructing such a model poses unique challenges specific to time series data: i) cross-frequency learning, ii) accommodating an arbitrary number of variates for multivariate time series, and iii) addressing the varying distributional properties inherent in large-scale data. To address these challenges, we present novel enhancements to the conventional time series Transformer architecture, resulting in our proposed Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai). Trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains, Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models. Code, data, and model weights can be found at https://github.com/SalesforceAIResearch/uni2ts.
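One of the challenges the abstract names is accommodating an arbitrary number of variates. A minimal sketch of the general idea (illustrative only, not the authors' implementation; `flatten_variates` and its token layout are hypothetical): a multivariate series can be flattened into a single token sequence of patches, each tagged with its variate ID, so the model's input format is independent of how many variates a dataset has.

```python
# Illustrative sketch: flatten a multivariate series into one token sequence.
# Each token carries (variate_id, start_time, patch) so downstream attention
# can distinguish variates without fixing their count in advance.

def flatten_variates(series, patch_size):
    """series: list of per-variate value lists (assumed equal length).
    Returns a flat list of (variate_id, start_index, patch) tokens."""
    tokens = []
    for var_id, values in enumerate(series):
        for start in range(0, len(values) - patch_size + 1, patch_size):
            tokens.append((var_id, start, values[start:start + patch_size]))
    return tokens

# Two variates, six time steps, patch size 3 -> four tokens in one sequence.
series = [[1, 2, 3, 4, 5, 6], [10, 20, 30, 40, 50, 60]]
tokens = flatten_variates(series, patch_size=3)
# e.g. tokens[0] == (0, 0, [1, 2, 3]) and tokens[3] == (1, 3, [40, 50, 60])
```

The same sequence format then works whether a dataset has one variate or hundreds, which is the property a universal forecaster needs.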

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Time Series Forecasting | ETTh1 (336) Multivariate | MAE | 0.429 | MOIRAI-Small |
| Time Series Forecasting | ETTh1 (336) Multivariate | MSE | 0.412 | MOIRAI-Small |
| Time Series Forecasting | ETTh1 (336) Multivariate | MAE | 0.450 | MOIRAI-Base |
| Time Series Forecasting | ETTh1 (336) Multivariate | MSE | 0.456 | MOIRAI-Base |
| Time Series Forecasting | ETTh1 (336) Multivariate | MAE | 0.474 | MOIRAI-Large |
| Time Series Forecasting | ETTh1 (336) Multivariate | MSE | 0.514 | MOIRAI-Large |
| Time Series Analysis | ETTh1 (336) Multivariate | MAE | 0.429 | MOIRAI-Small |
| Time Series Analysis | ETTh1 (336) Multivariate | MSE | 0.412 | MOIRAI-Small |
| Time Series Analysis | ETTh1 (336) Multivariate | MAE | 0.450 | MOIRAI-Base |
| Time Series Analysis | ETTh1 (336) Multivariate | MSE | 0.456 | MOIRAI-Base |
| Time Series Analysis | ETTh1 (336) Multivariate | MAE | 0.474 | MOIRAI-Large |
| Time Series Analysis | ETTh1 (336) Multivariate | MSE | 0.514 | MOIRAI-Large |
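The MAE and MSE columns in the results above are the standard point-forecast error metrics (mean absolute error and mean squared error), computed over all predicted values. A minimal reference implementation:

```python
# Mean absolute error: average of |y_true - y_pred|.
def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Mean squared error: average of (y_true - y_pred)^2.
def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [1.0, 2.0, 3.0]
y_pred = [1.5, 1.5, 3.5]
print(mae(y_true, y_pred))  # 0.5
print(mse(y_true, y_pred))  # 0.25
```

On this benchmark, lower is better for both metrics; note that MSE penalizes large errors more heavily than MAE.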

Related Papers

- The Power of Architecture: Deep Dive into Transformer Architectures for Long-Term Time Series Forecasting (2025-07-17)
- MoTM: Towards a Foundation Model for Time Series Imputation based on Continuous Modeling (2025-07-17)
- Data Augmentation in Time Series Forecasting through Inverted Framework (2025-07-15)
- D3FL: Data Distribution and Detrending for Robust Federated Learning in Non-linear Time-series Data (2025-07-15)
- Towards Interpretable Time Series Foundation Models (2025-07-10)
- MoFE-Time: Mixture of Frequency Domain Experts for Time-Series Forecasting Models (2025-07-09)
- Foundation models for time series forecasting: Application in conformal prediction (2025-07-09)
- Bridging the Last Mile of Prediction: Enhancing Time Series Forecasting with Conditional Guided Flow Matching (2025-07-09)