A Review for Pre-Trained Transformer-Based Time Series Forecasting Models
2023 IEEE 64th International Scientific Conference on Information Technology and Management Science of Riga Technical University (ITMS 2023): Proceedings
Yunus Emre Midilli, Sergejs Paršutins

Transformer-based models have proven their superiority over recurrent networks in time series forecasting. Enhancing transformer-based forecasting models via pretraining tasks is a novel direction in the literature. In this paper, we review the most recent work on pretraining for time series, as well as the pretraining tasks used in transformer-based architectures.
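
The paper itself presents no code; as a rough illustration of the masked auto-encoder pretraining task named in the keywords, the following PyTorch sketch masks random timesteps of an input series and trains a small transformer encoder to reconstruct them. All class names, layer sizes, and the 15% mask ratio are illustrative assumptions, not taken from the reviewed works.

    # Minimal sketch of masked-reconstruction pretraining for a
    # transformer time-series encoder. Sizes and names are illustrative.
    import torch
    import torch.nn as nn

    class MaskedSeriesEncoder(nn.Module):
        def __init__(self, n_features: int, d_model: int = 64):
            super().__init__()
            self.embed = nn.Linear(n_features, d_model)      # per-timestep projection
            layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)
            self.head = nn.Linear(d_model, n_features)       # reconstruction head

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, time, features) -> reconstructed series of same shape
            return self.head(self.encoder(self.embed(x)))

    def masked_reconstruction_loss(model, x, mask_ratio=0.15):
        # Zero out a random subset of timesteps; score reconstruction there only.
        mask = torch.rand(x.shape[:2], device=x.device) < mask_ratio  # (batch, time)
        x_masked = x.masked_fill(mask.unsqueeze(-1), 0.0)
        recon = model(x_masked)
        return nn.functional.mse_loss(recon[mask], x[mask])

    # Usage: pretrain on unlabeled series, then fine-tune the encoder for forecasting.
    model = MaskedSeriesEncoder(n_features=8)
    series = torch.randn(32, 96, 8)                          # toy batch
    loss = masked_reconstruction_loss(model, series)
    loss.backward()

Contrastive pretraining, the other task family the review covers, would instead pair augmented views of the same series and pull their embeddings together while pushing apart embeddings of different series.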


Keywords
contrastive learning | forecasting | masked auto-encoder | pretraining | transformer
DOI
10.1109/ITMS59786.2023.10317721
Hyperlink
https://ieeexplore.ieee.org/document/10317721

Midilli, Y., Paršutins, S. A Review for Pre-Trained Transformer-Based Time Series Forecasting Models. In: 2023 IEEE 64th International Scientific Conference on Information Technology and Management Science of Riga Technical University (ITMS 2023): Proceedings, Riga, Latvia, 5-6 October 2023. Piscataway: IEEE, 2023, pp. 1-6. ISBN 979-8-3503-7030-0. e-ISBN 979-8-3503-7029-4. ISSN 2771-6953. e-ISSN 2771-6937. Available from: doi:10.1109/ITMS59786.2023.10317721

Publication language
English (en)