This paper introduces the Temporal Kolmogorov-Arnold Transformer (TKAT), a novel attention-based architecture for time series forecasting. Inspired by the Temporal Fusion Transformer (TFT), TKAT combines the theoretical foundation of the Kolmogorov-Arnold representation with the power of transformers. The model targets tasks in which past observed features carry more predictive information than features known a priori. TKAT aims to simplify complex dependencies in time series, making them more interpretable, while its transformer backbone captures long-range dependencies through self-attention.
TKAT is built upon Temporal Kolmogorov-Arnold Networks (TKANs), which combine the power of Kolmogorov-Arnold Networks (KANs) with memory management for time dependency: Recurrent Kolmogorov-Arnold Network (RKAN) sublayers handle short-term memory, while a cell state carries longer-term information, much as in an LSTM. TKAN layers are thus able to detect temporal dependencies in time series data, and the model is adapted to problems, common in financial tasks, where known future inputs make up only a minority of the features. A rough sketch of this recurrence is given below.
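As a rough illustration of the recurrence TKAN layers implement, the sketch below builds an LSTM-style cell whose candidate update is produced by a stand-in for an RKAN sublayer. All class and function names here (`ToyTKANCell`, `ToyRKAN`) are hypothetical simplifications for exposition, not the authors' API, and the tanh basis expansion merely stands in for the learnable univariate functions of a real KAN layer.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ToyRKAN:
    """Stand-in for an RKAN sublayer: a random linear map followed by a
    tanh basis (a hypothetical simplification of KAN's learnable splines)."""
    def __init__(self, input_dim, units, rng):
        self.w = rng.standard_normal((input_dim, units)) * 0.1

    def __call__(self, x):
        return np.tanh(x @ self.w)

class ToyTKANCell:
    """LSTM-like cell: gates manage a long-term cell state c, while the
    candidate update comes from the RKAN-style sublayer (short-term path)."""
    def __init__(self, input_dim, units, seed=0):
        rng = np.random.default_rng(seed)
        d = input_dim + units
        self.Wf = rng.standard_normal((d, units)) * 0.1  # forget gate
        self.Wi = rng.standard_normal((d, units)) * 0.1  # input gate
        self.Wo = rng.standard_normal((d, units)) * 0.1  # output gate
        self.rkan = ToyRKAN(d, units, rng)

    def step(self, x, h, c):
        z = np.concatenate([x, h])
        f = sigmoid(z @ self.Wf)        # what to keep from the cell state
        i = sigmoid(z @ self.Wi)        # how much new candidate to admit
        o = sigmoid(z @ self.Wo)        # what to expose as hidden state
        c = f * c + i * self.rkan(z)    # candidate produced by the RKAN path
        h = o * np.tanh(c)
        return h, c

cell = ToyTKANCell(input_dim=4, units=8)
h, c = np.zeros(8), np.zeros(8)
for x in np.random.default_rng(1).standard_normal((20, 4)):  # 20 time steps
    h, c = cell.step(x, h, c)
print(h.shape)  # (8,)
```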
The architecture includes encoder-decoder components, gated residual networks (GRNs), and variable selection networks. The encoder processes past inputs, and its final cell state initializes the decoder. Multi-head self-attention then captures long-range relationships between time steps.
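To make the gated residual network component concrete, here is a minimal Keras sketch of a GRN in the style popularized by the TFT. The `GatedResidualNetwork` class name, layer sizes, and the exact gating arrangement are illustrative assumptions rather than the authors' implementation.

```python
import tensorflow as tf
from tensorflow.keras import layers

class GatedResidualNetwork(layers.Layer):
    """GRN in the TFT style: dense -> ELU -> dense, then a sigmoid gate,
    a residual skip connection, and layer normalization."""
    def __init__(self, units):
        super().__init__()
        self.dense1 = layers.Dense(units, activation="elu")
        self.dense2 = layers.Dense(units)
        self.gate = layers.Dense(units, activation="sigmoid")  # GLU-style gate
        self.skip = layers.Dense(units)  # projects input for the residual path
        self.norm = layers.LayerNormalization()

    def call(self, x):
        hidden = self.dense2(self.dense1(x))
        gated = self.gate(hidden) * hidden       # gate can suppress the nonlinear path
        return self.norm(self.skip(x) + gated)   # residual connection + normalization

grn = GatedResidualNetwork(units=16)
out = grn(tf.random.normal((32, 10, 8)))  # (batch, time, features)
print(out.shape)  # (32, 10, 16)
```

The gating lets the network damp the nonlinear path wherever it adds nothing, which is the usual motivation for using GRNs liberally in TFT-style models.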
The model is evaluated on a dataset of cryptocurrency trading volumes, with the task of predicting the notional amount traded on the market several steps ahead. TKAT outperforms the benchmark architectures, particularly on multi-step forecasting, as measured by R-squared and root mean square error; the results indicate that it captures long-term dependencies more effectively and improves forecasting accuracy. The model is implemented in Python and can be installed using pip.
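For reference, the two reported metrics can be computed in a few lines of NumPy; this is a generic sketch of the standard definitions, not the authors' evaluation code.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])
print(rmse(y_true, y_pred), r_squared(y_true, y_pred))
```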