Informer is a transformer-based model designed for long sequence time-series forecasting (LSTF). It addresses the challenges of quadratic time complexity, high memory usage, and inefficiency in handling long sequences that limit the vanilla Transformer. The model introduces three key components: (1) ProbSparse self-attention, which reduces time and memory complexity to O(L log L) while maintaining comparable performance by attending only with the most informative queries; (2) self-attention distilling, which halves the sequence length between encoder layers so that long inputs can be processed efficiently while dominant attention scores are preserved; and (3) a generative-style decoder that predicts an entire long sequence in a single forward pass rather than step by step, greatly improving inference speed. Extensive experiments on four large-scale datasets show that Informer significantly outperforms existing methods on LSTF tasks, making it a promising solution for long-term time-series forecasting.
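
To make the ProbSparse idea concrete, below is a minimal, single-head NumPy sketch of the mechanism described above: each query's sparsity is scored against a random sample of keys, only the top-u queries attend in full, and the remaining "lazy" queries fall back to the mean of V (the paper's self-attention case). This is an illustrative simplification, not the authors' implementation; the function name, the unbatched (L, d) shapes, and the sampling constant `factor` are assumptions made for clarity.

```python
import numpy as np

def prob_sparse_attention(Q, K, V, factor=5):
    """Sketch of ProbSparse self-attention. Q, K, V: (L, d) arrays.

    `factor` is the constant c from the paper, so roughly
    u = c * ln(L) queries are kept and c * ln(L) keys are sampled.
    """
    L, d = Q.shape
    u = min(L, int(np.ceil(factor * np.log(L))))          # active queries kept
    sample_k = min(L, int(np.ceil(factor * np.log(L))))   # keys sampled per query

    # 1. Score each query's sparsity on sampled keys:
    #    M(q, K) = max_j(q.k_j / sqrt(d)) - mean_j(q.k_j / sqrt(d)).
    idx = np.random.choice(L, sample_k, replace=False)
    scores_sample = Q @ K[idx].T / np.sqrt(d)             # (L, sample_k)
    M = scores_sample.max(axis=1) - scores_sample.mean(axis=1)

    # 2. Keep only the u queries with the largest sparsity measure.
    top = np.argsort(M)[-u:]

    # 3. Full softmax attention for active queries; lazy queries
    #    receive the mean of V as their output.
    out = np.repeat(V.mean(axis=0, keepdims=True), L, axis=0)
    s = Q[top] @ K.T / np.sqrt(d)                         # (u, L)
    w = np.exp(s - s.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    out[top] = w @ V
    return out

# Tiny usage example with random inputs.
rng = np.random.default_rng(0)
L, d = 96, 64
Q, K, V = (rng.standard_normal((L, d)) for _ in range(3))
print(prob_sparse_attention(Q, K, V).shape)  # (96, 64)
```

The key efficiency point is step 3: only u = O(log L) queries ever compute a full (1, L) score row, so the dominant cost is O(L log L) dot products instead of the O(L^2) of dense self-attention.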