MOMENT: A Family of Open Time-series Foundation Models

2024 | Mononito Goswami, Konrad Szafer, Arjun Choudhry, Yifu Cai, Shuo Li, Artur Dubrawski
**Abstract:** We introduce MOMENT, a family of open-source foundation models for general-purpose time series analysis. Pre-training large models on time series data is challenging due to the absence of a cohesive public repository and the diversity of time series characteristics. We address these challenges by compiling the Time Series Pile, a large collection of diverse public time series data, and by designing a benchmark to evaluate models in limited-supervision settings. MOMENT models are pre-trained using masked time series prediction and can be fine-tuned for tasks such as forecasting, classification, anomaly detection, and imputation. Experiments demonstrate the effectiveness of MOMENT with minimal data and task-specific fine-tuning. We also explore the properties of these models, including their ability to capture intuitive time series characteristics and their cross-modal transfer learning capabilities.

**Key Contributions:**

1. **Pre-training Data:** We compiled the Time Series Pile, a diverse collection drawing on more than five public time series databases from various domains.
2. **Multi-dataset Pre-training:** We addressed the challenges of pre-training across heterogeneous datasets by training MOMENT models on a large corpus of unlabeled time series.
3. **Evaluation Benchmark:** We designed a benchmark to evaluate MOMENT on diverse tasks and datasets in limited-supervision settings.

**Experimental Setup:**

- **Model Architecture:** MOMENT models are high-capacity transformers that break each time series into disjoint (non-overlapping) patches; a sketch of the patching-and-masking mechanics follows this list.
- **Pre-training Setup:** We pre-trained MOMENT models on a masked time series prediction task, minimizing reconstruction error on the masked patches.
- **Fine-tuning Settings:** MOMENT can be fine-tuned end-to-end or used in zero-shot and few-shot settings.
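To make the pre-training objective concrete, the following is a minimal PyTorch sketch of masked-patch prediction. The patch length, model width, mask ratio, and encoder depth are illustrative assumptions rather than the paper's settings; the sketch only shows the mechanics of splitting a series into disjoint patches, masking a subset of them, and reconstructing them.

```python
import torch
import torch.nn as nn

# Illustrative sizes; these are assumptions, not the paper's configuration.
SEQ_LEN, PATCH_LEN, D_MODEL = 512, 8, 256

class MaskedPatchPretrainer(nn.Module):
    """Sketch of masked time series prediction over disjoint patches."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(PATCH_LEN, D_MODEL)             # patch -> token
        self.mask_token = nn.Parameter(torch.zeros(D_MODEL))   # learnable [MASK]
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.head = nn.Linear(D_MODEL, PATCH_LEN)              # token -> patch

    def forward(self, x: torch.Tensor, mask_ratio: float = 0.3) -> torch.Tensor:
        # x: (batch, SEQ_LEN) univariate series -> (batch, n_patches, PATCH_LEN)
        patches = x.unfold(1, PATCH_LEN, PATCH_LEN)
        tokens = self.embed(patches)
        # Replace a random subset of patch tokens with the [MASK] embedding.
        mask = torch.rand(tokens.shape[:2], device=x.device) < mask_ratio
        tokens = torch.where(mask.unsqueeze(-1), self.mask_token, tokens)
        recon = self.head(self.encoder(tokens))
        # Reconstruction error is computed only on the masked patches.
        return ((recon - patches) ** 2)[mask].mean()

model = MaskedPatchPretrainer()
loss = model(torch.randn(4, SEQ_LEN))
loss.backward()
```

Scoring the loss only on masked patches forces the encoder to infer missing segments from their context, which is also what lets the same pre-trained model be applied to imputation and reconstruction-based anomaly scoring.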
**Results:**

- **Long-horizon Forecasting:** MOMENT achieved near state-of-the-art performance on most datasets and horizons.
- **Zero-shot Short-horizon Forecasting:** MOMENT outperformed statistical methods on some datasets.
- **Classification:** MOMENT learned distinct representations for different classes without task-specific fine-tuning.
- **Anomaly Detection:** MOMENT consistently outperformed other models in both zero-shot and linear-probing configurations (a simple reconstruction-error scoring sketch appears at the end of this summary).
- **Imputation:** MOMENT achieved the lowest reconstruction error on all ETT datasets.

**Conclusion:** MOMENT is the first open-source family of time series foundation models, addressing key challenges in pre-training and evaluation. Our experiments show that MOMENT is effective for multiple time series analysis tasks in limited-supervision settings, performing especially well on anomaly detection and classification problems. We also report intriguing empirical findings about time series foundation models, including their ability to capture intuitive time series characteristics and their cross-modal transfer learning capabilities.
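Since the models are open source, the released checkpoints can be tried directly. The snippet below sketches loading a pre-trained MOMENT model for forecasting through the authors' `momentfm` Python package; the checkpoint name and argument names follow the public repository's README at the time of writing and should be treated as assumptions that may change between releases.

```python
# Sketch of loading a released MOMENT checkpoint via the authors'
# `momentfm` package (pip install momentfm). Names and arguments follow
# the repository README and may change; verify against the repository.
import torch
from momentfm import MOMENTPipeline

model = MOMENTPipeline.from_pretrained(
    "AutonLab/MOMENT-1-large",
    model_kwargs={"task_name": "forecasting", "forecast_horizon": 96},
)
model.init()  # initialize the task-specific head

# MOMENT consumes channel-independent series with a context window of 512.
x = torch.randn(16, 1, 512)  # (batch, channels, context length)
with torch.no_grad():
    output = model(x_enc=x)
print(output.forecast.shape)  # expected: (16, 1, 96)
```

Swapping `task_name` (e.g., to `"reconstruction"` or `"classification"`) selects a different lightweight head on top of the same encoder, which is what keeps linear probing cheap.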
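Finally, as a concrete illustration of the zero-shot anomaly detection result above: a masked-reconstruction model can score each time step by how badly it is reconstructed from context. The scoring and thresholding below are our illustrative choices, not the paper's evaluation protocol.

```python
import numpy as np

def anomaly_scores(series: np.ndarray, reconstruction: np.ndarray) -> np.ndarray:
    """Pointwise anomaly score: squared reconstruction error."""
    return (series - reconstruction) ** 2

def flag_anomalies(scores: np.ndarray, quantile: float = 0.99) -> np.ndarray:
    """Flag points whose score exceeds an empirical quantile cutoff.

    The 99th-percentile cutoff is an illustrative assumption; any
    downstream thresholding or ranking scheme could be used instead.
    """
    return scores > np.quantile(scores, quantile)

# Toy example: a flat series with one injected spike the model cannot explain.
series = np.zeros(100)
series[42] = 5.0
reconstruction = np.zeros(100)  # stand-in for a model's reconstruction
print(np.flatnonzero(flag_anomalies(anomaly_scores(series, reconstruction))))
# -> [42]
```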