Unified Training of Universal Time Series Forecasting Transformers

2024 | Gerald Woo, Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, Doyen Sahoo
This paper introduces MOIRAI, a masked-encoder universal time series forecasting Transformer designed to address the core challenges of universal forecasting: varying frequencies, arbitrary numbers of variates, and diverse distributional properties. The model is pre-trained on the Large-scale Open Time Series Archive (LOTSA), a collection of open time series datasets spanning nine domains with over 27 billion observations, and achieves competitive or superior zero-shot performance compared to full-shot models.

MOIRAI extends the conventional time series Transformer architecture with three components. Multi patch size projection layers handle different frequencies by mapping each frequency to an appropriate patch size. Any-variate Attention supports arbitrary numbers of variates by encoding variate indices through binary attention biases, so token pairs from the same variate and pairs from different variates receive distinct learnable biases. Finally, the output head parameterizes a mixture distribution, allowing the model to adapt its probabilistic forecasts to varied distributional properties.

The paper also presents LOTSA itself, the largest collection of open time series datasets available for pre-training forecasting models. Evaluated on both in-distribution and out-of-distribution tasks, MOIRAI delivers strong results across probabilistic and long sequence forecasting benchmarks, with competitive or superior performance relative to full-shot baselines.
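The binary attention bias behind Any-variate Attention can be illustrated with a minimal sketch: each token in the flattened multivariate sequence carries a variate index, and pairs from the same versus different variates get two distinct scalar biases added to the pre-softmax attention scores. The function name and the specific bias values below are illustrative assumptions, not MOIRAI's actual implementation (where the biases are learnable parameters).

```python
import numpy as np

def any_variate_bias(variate_ids, u, v):
    """Illustrative binary attention bias (assumed helper, not MOIRAI's API).

    variate_ids: (seq_len,) array giving each flattened token's variate index.
    u: bias for token pairs from the same variate.
    v: bias for token pairs from different variates.
    Returns a (seq_len, seq_len) matrix to add to attention scores before softmax.
    """
    same = variate_ids[None, :] == variate_ids[:, None]  # pairwise same-variate mask
    return np.where(same, u, v)

# Toy example: 2 variates, 3 patch tokens each, flattened into one sequence.
ids = np.array([0, 0, 0, 1, 1, 1])
bias = any_variate_bias(ids, u=1.0, v=-1.0)  # u, v would be learned in practice
```

Because the bias depends only on whether two tokens share a variate index (not on absolute position or a fixed variate embedding table), the same mechanism works for any number of variates, which is the point of the design.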
The paper concludes that MOIRAI represents a significant advance in time series forecasting, offering a flexible and powerful solution for universal forecasting.