STAEformer: Spatio-Temporal Adaptive Embedding Makes Vanilla Transformer SOTA for Traffic Forecasting

21 Aug 2023 · Hangchen Liu, Zheng Dong, Renhe Jiang, Jiewen Deng, Jinliang Deng, Quanjun Chen, Xuan Song

With the rapid development of the Intelligent Transportation System (ITS), accurate traffic forecasting has emerged as a critical challenge. The key bottleneck lies in capturing the intricate spatio-temporal traffic patterns. In recent years, numerous neural networks with complicated architectures have been proposed to address this issue. However, the advancements in network architectures have encountered diminishing performance gains. In this study, we present a novel component called spatio-temporal adaptive embedding that can yield outstanding results with vanilla transformers. Our proposed Spatio-Temporal Adaptive Embedding transformer (STAEformer) achieves state-of-the-art performance on five real-world traffic forecasting datasets. Further experiments demonstrate that spatio-temporal adaptive embedding plays a crucial role in traffic forecasting by effectively capturing intrinsic spatio-temporal relations and chronological information in traffic time series.
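The core idea described above, a learnable spatio-temporal adaptive embedding fed into a vanilla transformer, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the module name `STAdaptiveEmbedding` and the dimensions (a 24-dim value embedding plus an 80-dim adaptive embedding) are assumptions chosen for the sketch. The key component is a learnable tensor indexed by time step and node, shared across all samples and trained end to end, which is concatenated with the projected traffic values before the transformer layers.

```python
import torch
import torch.nn as nn

class STAdaptiveEmbedding(nn.Module):
    """Illustrative sketch of a spatio-temporal adaptive embedding layer.

    Module name and dimensions are hypothetical; the paper's actual
    implementation may differ in layout and additional embeddings
    (e.g. time-of-day and day-of-week components).
    """

    def __init__(self, num_steps, num_nodes, in_dim=1,
                 feat_dim=24, adaptive_dim=80):
        super().__init__()
        # Project raw traffic readings into a feature embedding.
        self.value_proj = nn.Linear(in_dim, feat_dim)
        # The adaptive embedding: one learnable vector per
        # (time step, node) pair, learned jointly with the model.
        self.adaptive_emb = nn.Parameter(
            torch.empty(num_steps, num_nodes, adaptive_dim))
        nn.init.xavier_uniform_(self.adaptive_emb)
        self.model_dim = feat_dim + adaptive_dim  # 104 with the defaults

    def forward(self, x):
        # x: (batch, num_steps, num_nodes, in_dim)
        batch = x.size(0)
        feat = self.value_proj(x)                          # (B, T, N, feat_dim)
        adp = self.adaptive_emb.expand(batch, -1, -1, -1)  # (B, T, N, adaptive_dim)
        # Concatenated embedding is what the vanilla transformer consumes.
        return torch.cat([feat, adp], dim=-1)              # (B, T, N, model_dim)

# Example: 12 input steps, 207 sensors (METR-LA's node count).
emb = STAdaptiveEmbedding(num_steps=12, num_nodes=207)
out = emb(torch.randn(8, 12, 207, 1))
print(out.shape)  # torch.Size([8, 12, 207, 104])
```

Because the adaptive embedding is indexed by both time step and node, attention over the concatenated representation can pick up node-specific and step-specific patterns without any hand-crafted graph structure, which is what lets the plain transformer backbone remain unchanged.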


Results from the Paper


Task                 Dataset    Model       Metric          Value   Global Rank
Traffic Prediction   METR-LA    STAEformer  MAE @ 12 steps  3.34    #4
Traffic Prediction   METR-LA    STAEformer  MAE @ 3 steps   2.65    #5
Traffic Prediction   PeMS04     STAEformer  12-step MAE     18.22   #3
Traffic Prediction   PeMS07     STAEformer  MAE @ 1h        19.14   #2
Traffic Prediction   PeMS08     STAEformer  MAE @ 1h        13.46   #2
Traffic Prediction   PEMS-BAY   STAEformer  MAE @ 12 steps  1.91    #8
Traffic Prediction   PeMSD7     STAEformer  12-step MAE     19.14   #1
Traffic Prediction   PeMSD7     STAEformer  12-step RMSE    32.60   #3
Traffic Prediction   PeMSD7     STAEformer  12-step MAPE    8.01    #3
