fKAN: Fractional Kolmogorov-Arnold Networks with trainable Jacobi basis functions


11 Jun 2024 | Alireza Afzal Aghaei
The paper introduces the Fractional Kolmogorov-Arnold Network (fKAN), a novel neural network architecture that combines the distinctive structure of Kolmogorov-Arnold Networks (KANs) with trainable, adaptive fractional-orthogonal Jacobi functions as basis functions. The fKAN leverages the mathematical properties of fractional Jacobi functions, such as simple derivative formulas and non-polynomial behavior, to improve learning efficiency and accuracy. The architecture is evaluated on a range of deep-learning and physics-informed deep-learning tasks, including synthetic regression, image classification, image denoising, and sentiment analysis, as well as on ordinary, partial, and fractional delay differential equations. The results show that integrating fractional Jacobi functions into KANs significantly improves training speed and performance across these diverse applications. The paper also examines the properties of Jacobi polynomials that make them suitable as activation functions in neural networks, and presents a detailed methodology for the proposed fKAN along with experimental validation.
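The core building block described above is a trainable fractional Jacobi basis used as an adaptive activation. The summary does not include code, so the following is a minimal, hypothetical PyTorch sketch assuming the common construction of fractional Jacobi functions as P_n^(a,b)(2x^α − 1) on [0, 1] with trainable α, a, and b evaluated by the standard three-term recurrence. The class name, the sigmoid squashing of inputs, and the learned mixing coefficients are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FractionalJacobiActivation(nn.Module):
    """Sketch of a trainable fractional Jacobi basis activation (assumed design).

    Squashes inputs into (0, 1), applies the fractional substitution
    t = 2 * s**alpha - 1, evaluates Jacobi polynomials P_n^(a,b)(t) via the
    three-term recurrence, and mixes them with learned coefficients.
    alpha, a, b are trainable, constrained to alpha > 0 and a, b > -1.
    """

    def __init__(self, degree: int = 5):
        super().__init__()
        self.degree = degree
        # Unconstrained parameters; mapped to valid ranges in forward().
        self._alpha = nn.Parameter(torch.zeros(1))  # alpha = softplus(_alpha) + eps
        self._a = nn.Parameter(torch.zeros(1))      # a = softplus(_a) - 1 + eps
        self._b = nn.Parameter(torch.zeros(1))      # b = softplus(_b) - 1 + eps
        self.coeffs = nn.Parameter(torch.randn(degree + 1) / (degree + 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        alpha = F.softplus(self._alpha) + 1e-3
        a = F.softplus(self._a) - 1.0 + 1e-3
        b = F.softplus(self._b) - 1.0 + 1e-3
        s = torch.sigmoid(x)                 # map inputs into (0, 1)
        t = 2.0 * s.pow(alpha) - 1.0         # fractional mapping into (-1, 1)

        # Three-term recurrence for Jacobi polynomials P_n^(a,b)(t).
        polys = [torch.ones_like(t)]
        if self.degree >= 1:
            polys.append(0.5 * (a - b) + 0.5 * (a + b + 2.0) * t)
        for n in range(2, self.degree + 1):
            c = 2.0 * n + a + b
            a1 = 2.0 * n * (n + a + b) * (c - 2.0)
            a2 = (c - 1.0) * (a * a - b * b)
            a3 = (c - 1.0) * c * (c - 2.0)
            a4 = 2.0 * (n + a - 1.0) * (n + b - 1.0) * c
            polys.append(((a2 + a3 * t) * polys[-1] - a4 * polys[-2]) / a1)

        basis = torch.stack(polys, dim=-1)   # shape (..., degree + 1)
        return basis @ self.coeffs           # learned combination of basis functions

# Example usage: drop the activation into an otherwise ordinary MLP.
model = nn.Sequential(nn.Linear(8, 32), FractionalJacobiActivation(degree=5), nn.Linear(32, 1))
y = model(torch.randn(16, 8))
```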