MQTransformer: Multi-Horizon Forecasts with Context Dependent and Feedback-Aware Attention

Recent advances in neural forecasting have produced major improvements in accuracy for probabilistic demand prediction. In this work, we propose novel improvements to the current state of the art by incorporating changes inspired by recent advances in Transformer architectures for Natural Language Processing...
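The abstract above is truncated, so as orientation only, here is a minimal, generic sketch of the problem setting it describes: producing probabilistic (quantile) forecasts for multiple future horizons by attending over an encoded history. This is an illustration under assumptions of our own (PyTorch, learned per-horizon queries, a pinball loss), not the MQTransformer architecture from the paper.

```python
# Generic multi-horizon quantile forecasting with attention -- an
# illustrative sketch of the task, NOT the paper's MQTransformer.
# Module names, dimensions, and design choices here are assumptions.
import torch
import torch.nn as nn


class MultiHorizonQuantileDecoder(nn.Module):
    """Decodes H future horizons as Q quantiles by attending to history."""

    def __init__(self, d_model: int = 64, n_heads: int = 4,
                 horizons: int = 12, quantiles=(0.1, 0.5, 0.9)):
        super().__init__()
        self.quantiles = torch.tensor(quantiles)
        # One learned query vector per forecast horizon (hypothetical choice).
        self.horizon_queries = nn.Parameter(torch.randn(horizons, d_model))
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, len(quantiles))

    def forward(self, encoded_history):  # (batch, T_past, d_model)
        b = encoded_history.size(0)
        queries = self.horizon_queries.unsqueeze(0).expand(b, -1, -1)
        context, _ = self.attn(queries, encoded_history, encoded_history)
        return self.head(context)  # (batch, horizons, quantiles)


def pinball_loss(pred, target, quantiles):
    """Quantile (pinball) loss averaged over horizons and quantiles."""
    # pred: (batch, H, Q), target: (batch, H)
    err = target.unsqueeze(-1) - pred
    q = quantiles.to(pred.device).view(1, 1, -1)
    return torch.maximum(q * err, (q - 1) * err).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    encoded = torch.randn(8, 48, 64)   # stand-in encoder output over 48 past steps
    target = torch.randn(8, 12)        # 12 future target values per series
    decoder = MultiHorizonQuantileDecoder()
    preds = decoder(encoded)
    loss = pinball_loss(preds, target, decoder.quantiles)
    print(preds.shape, loss.item())
```

The paper's contributions (context-dependent and feedback-aware attention) would modify how the decoder attends to its inputs and to its own prior forecasts; those details are not reproduced here.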

ICLR 2021 (under review): PDF · Abstract
No code implementations yet.

Tasks


Results from the Paper



Methods used in the Paper