fKAN: Fractional Kolmogorov-Arnold Networks with trainable Jacobi basis functions

11 Jun 2024 | Alireza Afzal Aghaei
This paper introduces the Fractional Kolmogorov-Arnold Network (fKAN), a novel neural network architecture that integrates the distinctive features of Kolmogorov-Arnold Networks (KANs) with trainable fractional-orthogonal Jacobi basis functions. The fKAN leverages the mathematical properties of fractional Jacobi functions, including simple derivative formulas, non-polynomial behavior, and activity for both positive and negative inputs, to achieve efficient learning and enhanced accuracy.

At its core, the fKAN uses a trainable fractional Jacobi neural block (fJNB) as its activation function, allowing the Jacobi parameters α and β and the fractional order γ to be learned during training. This adaptability enhances the network's ability to capture complex patterns. Because the Jacobi basis is defined only on a finite interval, the fJNB first passes its input through a bounded activation function, ensuring that the argument of the basis functions stays within the required range.

The proposed architecture is evaluated across a variety of tasks in deep learning and physics-informed deep learning, including synthetic regression, image classification, image denoising, and sentiment analysis, as well as ordinary, partial, and fractional delay differential equations. Across these applications, the results show that integrating fractional Jacobi functions into KANs significantly improves training speed, and that the fKAN outperforms traditional activation functions and standard KANs in accuracy.
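To make the fJNB description above concrete, here is a minimal Keras sketch of such a block. It is an illustration of the idea, not the paper's reference implementation: it assumes the fractional Jacobi basis is obtained by evaluating classical Jacobi polynomials P_n^(α,β) at a trainable fractional power γ of a sigmoid-bounded input, and the softplus parameterization, default degree, and learned mixing coefficients are choices made here for the sketch.

    import tensorflow as tf
    from tensorflow import keras

    class FractionalJacobiBlock(keras.layers.Layer):
        # Hypothetical sketch of a trainable fractional Jacobi neural block
        # (fJNB); not the paper's exact code. It (1) bounds the input with a
        # sigmoid, (2) applies a trainable fractional power gamma, (3) maps
        # the result to the Jacobi domain [-1, 1], and (4) returns a learned
        # combination of Jacobi polynomials P_n^(alpha, beta) built by the
        # standard three-term recurrence, with alpha, beta, gamma trainable.

        def __init__(self, degree=3, **kwargs):
            super().__init__(**kwargs)
            self.degree = degree

        def build(self, input_shape):
            # Softplus reparameterization (an assumed choice) keeps
            # alpha, beta > -1 and gamma > 0 throughout training.
            self.raw_alpha = self.add_weight(name="raw_alpha", shape=(), initializer="zeros")
            self.raw_beta = self.add_weight(name="raw_beta", shape=(), initializer="zeros")
            self.raw_gamma = self.add_weight(name="raw_gamma", shape=(), initializer="zeros")
            # Mixing weights for the basis functions.
            self.coeffs = self.add_weight(
                name="coeffs", shape=(self.degree + 1,), initializer="ones")

        def call(self, x):
            # Small offsets keep the recurrence denominators away from zero.
            alpha = tf.nn.softplus(self.raw_alpha) - 1.0 + 1e-3
            beta = tf.nn.softplus(self.raw_beta) - 1.0 + 1e-3
            gamma = tf.nn.softplus(self.raw_gamma) + 1e-3

            # Bounded map into (0, 1), fractional power, rescale to [-1, 1].
            t = 2.0 * tf.pow(tf.sigmoid(x), gamma) - 1.0

            # Three-term recurrence for P_n^(alpha, beta)(t).
            p_prev = tf.ones_like(t)                                    # P_0
            p_curr = 0.5 * ((alpha - beta) + (alpha + beta + 2.0) * t)  # P_1
            basis = [p_prev, p_curr]
            for n in range(2, self.degree + 1):
                n = float(n)
                a = 2.0 * n * (n + alpha + beta) * (2.0 * n + alpha + beta - 2.0)
                b = (2.0 * n + alpha + beta - 1.0) * (alpha**2 - beta**2)
                c = ((2.0 * n + alpha + beta - 1.0) * (2.0 * n + alpha + beta)
                     * (2.0 * n + alpha + beta - 2.0))
                d = 2.0 * (n + alpha - 1.0) * (n + beta - 1.0) * (2.0 * n + alpha + beta)
                p_next = ((b + c * t) * p_curr - d * p_prev) / a
                basis.append(p_next)
                p_prev, p_curr = p_curr, p_next

            # Learned linear combination of the basis functions.
            return tf.add_n([w * p for w, p in
                             zip(tf.unstack(self.coeffs), basis[: self.degree + 1])])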
The fKAN is implemented in Python using Keras with the TensorFlow backend, and the results are publicly available in a GitHub repository.
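Continuing the sketch above, such a block can be dropped between standard Keras layers as the network's activation, in line with the paper's description of the fJNB; the widths, depth, and loss below are placeholders, not the paper's experimental setup:

    # Illustrative usage only (reuses FractionalJacobiBlock from the sketch).
    model = keras.Sequential([
        keras.layers.Dense(32),
        FractionalJacobiBlock(degree=3),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")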