Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks

Zonghan Wu, Shirui Pan, Guodong Long, Jing Jiang, Xiaojun Chang, Chengqi Zhang

2020-05-24 · Traffic Prediction · Time Series Forecasting · Graph Learning · Univariate Time Series Forecasting · Time Series · Multivariate Time Series Forecasting · Time Series Analysis

Abstract

Modeling multivariate time series has long attracted researchers from a diverse range of fields, including economics, finance, and traffic. A basic assumption behind multivariate time series forecasting is that its variables depend on one another, yet existing methods fail to fully exploit latent spatial dependencies between pairs of variables. In recent years, meanwhile, graph neural networks (GNNs) have shown high capability in handling relational dependencies. GNNs require well-defined graph structures for information propagation, which means they cannot be applied directly to multivariate time series where the dependencies are not known in advance. In this paper, we propose a general graph neural network framework designed specifically for multivariate time series data. Our approach automatically extracts the uni-directed relations among variables through a graph learning module, into which external knowledge like variable attributes can be easily integrated. A novel mix-hop propagation layer and a dilated inception layer are further proposed to capture the spatial and temporal dependencies within the time series. The graph learning, graph convolution, and temporal convolution modules are jointly learned in an end-to-end framework. Experimental results show that our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets and achieves on-par performance with other approaches on two traffic datasets that provide extra structural information.
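The graph learning module described above can be sketched as follows: two sets of learnable node embeddings are combined antisymmetrically, so that after a ReLU at most one of A[i, j] and A[j, i] is nonzero (the "uni-directed" relations), and each row is then sparsified to its top-k neighbours. This is a minimal NumPy sketch in the spirit of the paper; the function name and the choices of `alpha` and `k` are illustrative assumptions, not the authors' code.

```python
import numpy as np

def learn_graph(E1, E2, alpha=3.0, k=3):
    """Sketch of a uni-directed graph learning step: node embeddings
    E1, E2 (num_nodes x d) yield an adjacency matrix whose pairwise
    entries cannot both be large, then each row keeps its top-k values."""
    M1 = np.tanh(alpha * E1)
    M2 = np.tanh(alpha * E2)
    # Antisymmetric combination -> uni-directed relations after the ReLU
    A = np.maximum(0.0, np.tanh(alpha * (M1 @ M2.T - M2 @ M1.T)))
    # Row-wise top-k sparsification: zero out all but the k largest entries
    mask = np.zeros_like(A)
    idx = np.argsort(A, axis=1)[:, -k:]
    np.put_along_axis(mask, idx, 1.0, axis=1)
    return A * mask

rng = np.random.default_rng(0)
n, d = 6, 8
A = learn_graph(rng.standard_normal((n, d)), rng.standard_normal((n, d)))
print(A.shape)  # (6, 6)
```

In the full model these embeddings are trained jointly with the graph convolution and temporal convolution modules, so the learned adjacency adapts to the forecasting loss end-to-end.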

Results

Task                    | Dataset     | Metric       | Value  | Model
Time Series Forecasting | Electricity | RRSE         | 0.0745 | MTGNN (3 step)
Time Series Forecasting | Electricity | RRSE         | 0.0878 | MTGNN (6 step)
Time Series Forecasting | Electricity | RRSE         | 0.0916 | MTGNN (12 step)
Time Series Forecasting | Electricity | RRSE         | 0.0953 | MTGNN (24 step)
Traffic Prediction      | EXPY-TKY    | 1 step MAE   | 5.86   | MTGNN
Traffic Prediction      | EXPY-TKY    | 3 step MAE   | 6.49   | MTGNN
Traffic Prediction      | EXPY-TKY    | 6 step MAE   | 6.81   | MTGNN
Traffic Prediction      | NE-BJ       | 12 steps MAE | 4.9    | MTGNN
Time Series Analysis    | Electricity | RRSE         | 0.0745 | MTGNN (3 step)
Time Series Analysis    | Electricity | RRSE         | 0.0878 | MTGNN (6 step)
Time Series Analysis    | Electricity | RRSE         | 0.0916 | MTGNN (12 step)
Time Series Analysis    | Electricity | RRSE         | 0.0953 | MTGNN (24 step)
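The RRSE values in the Electricity rows are Root Relative Squared Error: the forecast's squared error normalised by that of a predict-the-mean baseline, so lower is better and a constant-mean forecast scores exactly 1. A small sketch of the standard definition (the helper name is ours):

```python
import numpy as np

def rrse(y_true, y_pred):
    """Root Relative Squared Error: squared forecast error relative to
    the squared error of always predicting the mean of y_true."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    num = np.sum((y_pred - y_true) ** 2)
    den = np.sum((y_true - y_true.mean()) ** 2)
    return float(np.sqrt(num / den))

y = np.array([1.0, 2.0, 3.0, 4.0])
print(rrse(y, y))                          # 0.0 (perfect forecast)
print(rrse(y, np.full_like(y, y.mean())))  # 1.0 (mean-only baseline)
```

On this scale, the reported 0.0745 at a 3-step horizon means the model's error is under 8% of the naive mean predictor's, with accuracy degrading gradually as the horizon grows to 24 steps.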

Related Papers

The Power of Architecture: Deep Dive into Transformer Architectures for Long-Term Time Series Forecasting (2025-07-17)
SGCL: Unifying Self-Supervised and Supervised Learning for Graph Recommendation (2025-07-17)
MoTM: Towards a Foundation Model for Time Series Imputation based on Continuous Modeling (2025-07-17)
Emergence of Functionally Differentiated Structures via Mutual Information Optimization in Recurrent Neural Networks (2025-07-17)
Data Augmentation in Time Series Forecasting through Inverted Framework (2025-07-15)
A Graph-in-Graph Learning Framework for Drug-Target Interaction Prediction (2025-07-15)
D3FL: Data Distribution and Detrending for Robust Federated Learning in Non-linear Time-series Data (2025-07-15)
Graph World Model (2025-07-14)