23 Jul 2024 | Eric A. F. Reinhardt, P. R. Dinesh, Sergei Gleyzer
This paper introduces SineKAN, a Kolmogorov-Arnold Network (KAN) variant that replaces basis-spline (B-Spline) activation functions with re-weighted sine functions. On the MNIST benchmark, SineKAN matches or exceeds the numerical performance of B-Spline KAN models while also running substantially faster, up to 4-8 times, in inference.
The SineKAN architecture uses learnable sine functions with grid-based phase shifts and frequencies, which allows for efficient and stable training and inference. The grid-based phase-shift strategy keeps model performance consistent across different grid sizes and depths, and SineKAN scales better than B-Spline KAN with larger batch sizes and deeper networks. The paper also discusses the universal approximation properties of the SineKAN model and validates its performance through extensive experiments over a range of network configurations and batch sizes. The results show that SineKAN outperforms B-Spline KAN in both inference speed and accuracy, making it a promising candidate for deep learning applications. The paper concludes that SineKAN is a viable alternative to B-Spline KAN in the development of Kolmogorov-Arnold networks, with potential for further exploration in other KAN implementations.
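To make the described architecture concrete, here is a minimal NumPy sketch of a layer built from learnable sine functions with grid-based phase shifts and frequencies. This is a hypothetical illustration based only on the summary above, not the authors' implementation: the class name `SineKANLayer`, the initialization scheme for `weights`, `freqs`, and `phases`, and the grid spacing are all assumptions.

```python
import numpy as np

class SineKANLayer:
    """Sketch of a KAN-style layer using sine basis functions.

    Each input feature is expanded over a grid of sine functions
    sin(omega_g * x + phi_g), and the resulting features are mixed
    into the outputs by learnable weights.
    (Hypothetical structure; not the paper's reference code.)
    """
    def __init__(self, in_dim, out_dim, grid_size=8, seed=0):
        rng = np.random.default_rng(seed)
        # Learnable mixing weights: one per (output, input, grid point).
        self.weights = rng.normal(
            0.0, 1.0 / np.sqrt(in_dim * grid_size),
            size=(out_dim, in_dim, grid_size),
        )
        # Grid-based frequencies and phase shifts (assumed init scheme:
        # integer frequencies and evenly spaced phases over the grid).
        self.freqs = np.arange(1, grid_size + 1, dtype=float)
        self.phases = np.pi * np.arange(grid_size) / grid_size

    def forward(self, x):
        # x: (batch, in_dim) -> sine features: (batch, in_dim, grid_size)
        feats = np.sin(x[:, :, None] * self.freqs + self.phases)
        # Mix sine features into outputs: (batch, out_dim)
        return np.einsum('big,oig->bo', feats, self.weights)

layer = SineKANLayer(in_dim=4, out_dim=3, grid_size=8)
y = layer.forward(np.ones((2, 4)))
print(y.shape)  # (2, 3)
```

In this sketch the per-edge activation is a weighted sum of fixed-frequency sines, which is cheap to evaluate in a single vectorized `sin` call; the speed advantage reported for SineKAN over B-Spline KANs plausibly comes from avoiding per-interval spline evaluation in exactly this way.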