Neural Operators with Localized Integral and Differential Kernels

8 Jun 2024 | Miguel Liu-Schiaffini*¹, Julius Berner*¹, Boris Bonev*², Thorsten Kurth², Kamyar Azizzadenesheli², Anima Anandkumar¹
This paper introduces a novel framework for local neural operators, which capture local features in function spaces and are particularly useful for solving partial differential equations (PDEs). The authors address a limitation of existing neural operator architectures, such as the Fourier Neural Operator (FNO), which often suffer from over-smoothing and fail to capture local details. They propose two types of localized operators:

1. **Differential operators**: By scaling the kernel values of convolutional layers appropriately, the authors show that these layers can converge to differential operators under suitable conditions. This is achieved by subtracting the mean of the kernel and scaling it by the reciprocal of the resolution.
2. **Integral operators with locally supported kernels**: The authors use discrete-continuous (DISCO) convolutions to define local integral operators that can be applied to general meshes on planar and spherical geometries. This approach retains the translation equivariance of convolutional layers while allowing efficient evaluation on unstructured grids.

The proposed local neural operators are integrated into existing FNO and spherical FNO (SFNO) architectures, yielding significant improvements on benchmarks including Darcy flow, the Navier-Stokes equations, the shallow water equations, and the 2D diffusion-reaction equation. The experiments show that the local operators reduce the relative \(L^2\)-error by up to 72% compared to baseline models. The paper also discusses the theoretical foundations and numerical experiments, highlighting the effectiveness of the proposed approach in capturing fine-grained scales and improving generalization.
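The differential-operator construction described above (subtract the kernel mean, scale by the reciprocal resolution) can be illustrated with a minimal 1D sketch. This is a hedged reconstruction, not the authors' exact implementation: a zero-mean stencil applied to samples of a smooth function and divided by the grid spacing `h` approximates `c1 * f'(x)`, where `c1` is the kernel's first moment.

```python
import numpy as np

def local_diff_apply(kernel, f_vals, h):
    """Apply a zero-mean, 1/h-scaled correlation stencil at interior points.

    Illustrative only: subtracting the mean removes the zeroth-order term,
    so as h -> 0 the scaled stencil behaves like a first-order derivative.
    """
    w = kernel - kernel.mean()          # zero-mean kernel: constant inputs map to zero
    k = len(w) // 2
    out = np.zeros(len(f_vals) - 2 * k)
    for j, wj in enumerate(w):
        out += wj * f_vals[j : j + len(out)]
    return out / h                      # scale by the reciprocal resolution

# Check on f(x) = sin(x): the stencil should approximate c1 * cos(x),
# where c1 = sum_j j * w'_j is the first moment of the zero-mean kernel.
kernel = np.array([0.2, 0.5, 0.1])
w = kernel - kernel.mean()
c1 = sum(j * wj for j, wj in zip([-1, 0, 1], w))

for h in [1e-2, 1e-3]:
    x = np.arange(-1, 1, h)
    approx = local_diff_apply(kernel, np.sin(x), h)
    exact = c1 * np.cos(x[1:-1])
    print(h, np.max(np.abs(approx - exact)))  # error shrinks as h decreases
```

The Taylor expansion `f(x + jh) = f(x) + jh f'(x) + O(h^2)` shows why: the zero-mean condition cancels the `f(x)` term, and the `1/h` scaling leaves `c1 f'(x)` plus an `O(h)` remainder.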
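The local integral operators can likewise be sketched as a quadrature sum, in the spirit of DISCO convolutions: the output at each point is a weighted sum of nearby function values against a compactly supported kernel. The hat-function basis, function names, and parameters below are illustrative assumptions, not the authors' parameterization.

```python
import numpy as np

def local_integral_op(x, f_vals, q, coeffs, r):
    """Quadrature sketch of a local integral operator on arbitrary 1D points:
    (K f)(x_i) = sum_j q_j * kappa(x_i - y_j) * f(y_j), kappa supported on |d| <= r.

    Hypothetical construction: kappa is a learnable coefficient times a
    compactly supported hat basis function (an assumed, minimal choice).
    """
    def kappa(d):
        return coeffs[0] * np.clip(1.0 - np.abs(d) / r, 0.0, None)

    out = np.zeros_like(f_vals)
    for i, xi in enumerate(x):
        d = xi - x
        mask = np.abs(d) <= r          # locality: only nearby quadrature points
        out[i] = np.sum(q[mask] * kappa(d[mask]) * f_vals[mask])
    return out

# Usage on a non-uniform grid: the same kernel evaluates on any point set,
# and the compact support keeps cost proportional to the neighbor count.
x = np.sort(np.random.default_rng(0).uniform(0, 2 * np.pi, 200))
q = np.gradient(x)                     # rough trapezoid-like quadrature weights
out = local_integral_op(x, np.sin(x), q, coeffs=np.array([1.0]), r=0.3)
```

Because the kernel depends only on the displacement `x_i - y_j`, the operator is translation-equivariant like an ordinary convolution, while the explicit quadrature weights `q` let it act on unstructured grids.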
The results demonstrate that localized kernels can significantly enhance the performance of neural operators on real-world scientific computing problems.