Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Enhancing Topological Dependencies in Spatio-Temporal Graphs with Cycle Message Passing Blocks

Minho Lee, Yun Young Choi, Sun Woo Park, Seunghwan Lee, Joohwan Ko, Jaeyoung Hong

2024-01-29 · Spatio-Temporal Forecasting · Traffic Prediction · Time Series Prediction
Paper · PDF · Code (official)

Abstract

Graph Neural Networks (GNNs) and Transformer-based models have been increasingly adopted to learn complex vector representations of spatio-temporal graphs, capturing the intricate spatio-temporal dependencies crucial for applications such as traffic forecasting. Although many existing methods utilize multi-head attention mechanisms and message-passing neural networks (MPNNs) to capture both spatial and temporal relations, these approaches encode temporal and spatial relations independently and reflect the graph's topological characteristics only in a limited manner. In this work, we introduce Cycle to Mixer (Cy2Mixer), a novel spatio-temporal GNN that combines topologically non-trivial invariants of spatio-temporal graphs with gated multi-layer perceptrons (gMLP). Cy2Mixer is composed of three MLP-based blocks: a temporal block for capturing temporal properties, a message-passing block for encapsulating spatial information, and a cycle message-passing block for enriching topological information through cyclic subgraphs. We bolster the effectiveness of Cy2Mixer with mathematical evidence that the cycle message-passing block offers the model information differentiated from that of the message-passing block. Furthermore, empirical evaluations substantiate the efficacy of Cy2Mixer, demonstrating state-of-the-art performance across various spatio-temporal benchmark datasets. The source code is available at https://github.com/leemingo/cy2mixer.
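The three-block design described in the abstract can be sketched minimally in NumPy. This is a hedged illustration, not the authors' implementation (see their repository for that): the function names, the residual-sum combination, and the fully connected cycle adjacency on a toy 4-cycle graph are all simplifying assumptions standing in for the paper's gMLP blocks and cycle-basis construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_adj(A):
    """Row-normalise an adjacency matrix for mean neighbour aggregation."""
    deg = A.sum(axis=1, keepdims=True)
    return A / np.maximum(deg, 1.0)

def mp_block(X, A, W):
    """Message passing: each node averages neighbour features, then projects.
    X: (T, N, d) spatio-temporal features; A: (N, N) adjacency; W: (d, d)."""
    return np.einsum("ij,tjd->tid", normalize_adj(A), X) @ W

def temporal_block(X, W_t):
    """Mix features along the time axis, independently per node. W_t: (T, T)."""
    return np.einsum("st,tnd->snd", W_t, X)

def cy2mixer_layer(X, A_spatial, A_cycle, params):
    """One layer as a residual sum of temporal, spatial message-passing, and
    cycle message-passing blocks (a simplification of the paper's gMLP blocks)."""
    out = X
    out = out + temporal_block(X, params["W_t"])
    out = out + mp_block(X, A_spatial, params["W_s"])
    out = out + mp_block(X, A_cycle, params["W_c"])  # cyclic-subgraph relations
    return out

# Toy sensor graph: 4 nodes whose edges 0-1, 1-2, 2-3, 3-0 form a 4-cycle.
N, T, d = 4, 6, 8
A = np.zeros((N, N))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    A[i, j] = A[j, i] = 1.0
# Assumed cycle adjacency: connect every pair of nodes on the shared cycle, so
# the cycle block sees relations invisible to 1-hop message passing
# (e.g. the 0-2 and 1-3 "diagonals" of the cycle).
A_cycle = np.ones((N, N)) - np.eye(N)

params = {"W_t": 0.1 * rng.normal(size=(T, T)),
          "W_s": 0.1 * rng.normal(size=(d, d)),
          "W_c": 0.1 * rng.normal(size=(d, d))}
X = rng.normal(size=(T, N, d))
Y = cy2mixer_layer(X, A, A_cycle, params)
print(Y.shape)  # (6, 4, 8): layer preserves the input shape
```

The toy cycle adjacency makes the paper's differentiated-information claim concrete: `A_cycle[0, 2]` is 1 while `A[0, 2]` is 0, so the cycle block aggregates over node pairs the spatial block never connects.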

Results

Task               | Dataset | Metric       | Value | Model
-------------------|---------|--------------|-------|---------
Traffic Prediction | PeMS07  | MAE@1h       | 19.45 | Cy2Mixer
Traffic Prediction | PeMS08  | MAE@1h       | 13.53 | Cy2Mixer
Traffic Prediction | PeMS04  | 12 Steps MAE | 18.14 | Cy2Mixer

Related Papers

- Wavelet-Enhanced Neural ODE and Graph Attention for Interpretable Energy Forecasting (2025-07-14)
- Federated Learning with Graph-Based Aggregation for Traffic Forecasting (2025-07-13)
- Predicting Graph Structure via Adapted Flux Balance Analysis (2025-07-08)
- SEED: A Structural Encoder for Embedding-Driven Decoding in Time Series Prediction with LLMs (2025-06-25)
- A Study of Dynamic Stock Relationship Modeling and S&P500 Price Forecasting Based on Differential Graph Transformer (2025-06-23)
- MS-TVNet: A Long-Term Time Series Prediction Method Based on Multi-Scale Dynamic Convolution (2025-06-08)
- Neural MJD: Neural Non-Stationary Merton Jump Diffusion for Time Series Prediction (2025-06-05)
- Beyond Attention: Learning Spatio-Temporal Dynamics with Emergent Interpretable Topologies (2025-06-01)