CLMFormer: Mitigating Data Redundancy to Revitalize Transformer-based Long-Term Time Series Forecasting System

16 Jul 2022  ·  Mingjie Li, Rui Liu, Guangsi Shi, Mingfei Han, Changling Li, Lina Yao, Xiaojun Chang, Ling Chen

Long-term time-series forecasting (LTSF) plays a crucial role in many practical applications. Transformers and their variants have become the de facto backbone for LTSF, offering exceptional capability in processing long sequences. However, existing Transformer-based models, such as FEDformer and Informer, often reach their best validation performance after only a few epochs, indicating that the Transformer's capacity is underutilized. One factor contributing to this overfitting is the data redundancy introduced by the rolling forecasting setting used for data augmentation, which is especially pronounced in longer sequences whose adjacent samples are highly similar. In this paper, we address this issue with curriculum learning and a memory-driven decoder. Specifically, we progressively introduce Bernoulli noise into the training samples, which breaks the high similarity between adjacent data points. To further improve forecasting accuracy, the memory-driven decoder captures seasonal tendencies and dependencies in the time series and exploits temporal relationships to aid forecasting. Experimental results on six real-life LTSF benchmarks demonstrate that our approach can be seamlessly plugged into a variety of Transformer-based models, improving their LTSF performance by up to 30%.
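As a rough illustration of the curriculum-noise idea described in the abstract (not the paper's exact procedure), the sketch below applies a Bernoulli mask to training sequences and ramps its corruption probability up over training, so early epochs see nearly clean samples and later epochs see noisier ones. The linear schedule, the masking form of the noise, the `max_drop_prob` value, and the function name are all assumptions for illustration.

```python
import torch


def apply_curriculum_bernoulli_noise(batch: torch.Tensor,
                                     epoch: int,
                                     max_epochs: int,
                                     max_drop_prob: float = 0.3) -> torch.Tensor:
    """Hypothetical sketch: corrupt a batch of sequences with Bernoulli noise
    whose strength grows with the training epoch (curriculum schedule).

    batch: tensor of shape (B, L, D) -- batch size, sequence length, features.
    """
    # Linear curriculum: corruption probability rises from 0 to max_drop_prob.
    p = max_drop_prob * min(1.0, epoch / max(1, max_epochs))
    # Sample a Bernoulli keep-mask per time step, shared across feature dims.
    keep = torch.bernoulli(torch.full(batch.shape[:2], 1.0 - p, device=batch.device))
    # Zero out the dropped time steps to break similarity between adjacent samples.
    return batch * keep.unsqueeze(-1)


# Example usage inside a training loop (shapes and names are illustrative):
# for epoch in range(max_epochs):
#     for x, y in train_loader:          # x: (B, L, D) encoder input
#         x_noisy = apply_curriculum_bernoulli_noise(x, epoch, max_epochs)
#         loss = model(x_noisy, y)
```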
