Neural Operators with Localized Integral and Differential Kernels

2024 | Miguel Liu-Schiaffini, Julius Berner, Boris Bonev, Thorsten Kurth, Kamyar Azizzadenesheli, Anima Anandkumar
This paper introduces a framework for local neural operators that capture local features in function spaces. The approach combines differential operators and integral operators with locally supported kernels, so that the learned operators can be applied at any resolution. The key ideas are to derive differential operators from convolutional layers by appropriately scaling the kernel values, and to obtain local integral operators via discrete-continuous convolutions. Both constructions preserve the defining property of operator learning: predictions remain consistent across resolutions. Integrating these local operators with Fourier neural operators (FNOs) significantly improves their performance, reducing the relative $ L^{2} $-error by 34-72% in experiments on turbulent 2D Navier-Stokes and spherical shallow water equations. The results show that local neural operators outperform existing methods at capturing local features and achieve higher accuracy in scientific computing applications. The framework applies to both planar and spherical domains and supports unstructured meshes. It is validated on several benchmark problems, including Darcy flow, the Navier-Stokes equations, diffusion-reaction equations, and the shallow water equations, with consistent improvements in performance. The study highlights the importance of incorporating local inductive biases in neural operators for effective learning of complex scientific models.
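The kernel-scaling idea can be illustrated in one dimension: a fixed convolution stencil, rescaled by the grid spacing, acts as a discretization of a differential operator that stays consistent as the resolution changes. The sketch below is a simplified illustration, not the paper's implementation; the central-difference stencil and the test function $\sin(x)$ are assumptions chosen for clarity.

```python
import numpy as np

def conv_derivative(u, h):
    """Apply a fixed 3-point convolution stencil rescaled by 1/(2h),
    so the same kernel values define d/dx at any grid resolution
    (a minimal version of the kernel-scaling idea)."""
    k = np.array([1.0, 0.0, -1.0]) / (2.0 * h)  # central difference
    return np.convolve(u, k, mode="same")

# Evaluate the same operator on two different resolutions.
errs = {}
for n in (64, 256):
    x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    h = x[1] - x[0]
    du = conv_derivative(np.sin(x), h)
    # Exclude the two boundary points affected by zero padding.
    errs[n] = np.max(np.abs(du - np.cos(x))[1:-1])
print(errs)  # error shrinks as the grid is refined
```

Because only the $1/(2h)$ scaling changes between resolutions while the kernel values are shared, the layer defines a single operator rather than a resolution-specific filter, which is the property the paper's differential layers preserve.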
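A locally supported integral kernel likewise becomes resolution-independent when the discrete sum is weighted by quadrature weights, which is the essence of a discrete-continuous convolution restricted to a regular 1-D grid. The sketch below is an assumption-laden simplification (the paper's construction additionally covers spherical domains and unstructured meshes); the hat-shaped kernel and support radius are illustrative choices.

```python
import numpy as np

def local_integral(u, x, kernel, radius):
    """Locally supported integral operator via a quadrature sum:
        (K u)(x_i) ~ sum_j kernel(x_i - x_j) * u(x_j) * h.
    Weighting by the grid spacing h keeps the output consistent
    when the input is sampled at a different resolution."""
    h = x[1] - x[0]
    out = np.empty_like(u)
    for i, xi in enumerate(x):
        near = np.abs(x - xi) <= radius  # only the kernel's support
        out[i] = np.sum(kernel(xi - x[near]) * u[near]) * h
    return out

# Hat kernel supported on |d| <= 0.25; its exact integral is 0.0625.
hat = lambda d: np.maximum(0.0, 0.25 - np.abs(d))

# Applying the operator to u = 1 at two resolutions gives the same
# value at an interior point: the kernel integral, not a grid sum.
vals = {}
for n in (128, 512):
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    vals[n] = local_integral(np.ones(n), x, hat, 0.25)[n // 2]
print(vals)
```

Without the factor of $h$, the output would grow with the number of points inside the support; with it, the layer approximates a fixed integral operator at every resolution.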