The paper introduces the Temporal Kolmogorov-Arnold Transformer (TKAT), a novel attention-based architecture designed to capture complex temporal patterns and relationships in multivariate time series data. Inspired by the Temporal Fusion Transformer (TFT), TKAT combines the theoretical foundation of Temporal Kolmogorov-Arnold Networks (TKANs) with the power of transformers. TKAT aims to disentangle complex dependencies in time series data, making them more interpretable. The architecture uses self-attention mechanisms to capture long-range dependencies and handles memory through recurrent TKAN layers. The model is evaluated on a dataset of Bitcoin trading volumes, showing improved performance over standard methods and simpler models, particularly in multi-step forecasting tasks. The results highlight the importance of the overall model architecture and the effectiveness of TKAN layers in enhancing performance.
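To make the described architecture concrete, below is a minimal, hypothetical sketch of a TKAT-like forecaster, not the authors' implementation. It assumes a generic recurrent encoder (a GRU standing in for the TKAN layers, which provide the memory management) feeding a self-attention block that models long-range dependencies before a dense multi-step output head; all class and parameter names are illustrative.

```python
# Illustrative sketch (not the authors' code): a recurrent encoder stands in
# for the TKAN layers, followed by self-attention and a multi-step forecast head.
import torch
import torch.nn as nn

class TKATLikeForecaster(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4, horizon: int = 5):
        super().__init__()
        # Recurrent encoder: placeholder for the TKAN layers handling memory.
        self.encoder = nn.GRU(n_features, d_model, batch_first=True)
        # Self-attention over the encoded sequence captures long-range dependencies.
        self.attention = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        # Dense head maps the last attended state to the multi-step forecast.
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features) multivariate history.
        encoded, _ = self.encoder(x)
        attended, _ = self.attention(encoded, encoded, encoded)
        summary = self.norm(encoded + attended)[:, -1]   # residual connection, last time step
        return self.head(summary)                        # (batch, horizon) forecast

# Example: forecast 5 future steps from 30 past steps of 8 features.
model = TKATLikeForecaster(n_features=8, horizon=5)
prediction = model(torch.randn(16, 30, 8))               # -> shape (16, 5)
```

The point of the sketch is the ordering the paper describes: recurrent (TKAN-style) processing for short-term memory first, then attention for long-range structure, then a head that emits the full multi-step horizon at once.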