17 May 2021 | Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar
The paper introduces the Fourier Neural Operator (FNO), a novel deep learning architecture designed to learn mappings between infinite-dimensional function spaces. Unlike classical neural networks that map between finite-dimensional spaces, FNOs directly parameterize integral operators in Fourier space, allowing for expressive and efficient architectures. The FNO is particularly useful for solving parametric partial differential equations (PDEs), as it can learn the entire family of solutions corresponding to different functional parametric dependencies, rather than solving a single instance of the equation.
Key contributions of the FNO include:
1. **Resolution-Invariant Solutions**: The FNO achieves zero-shot super-resolution: it can be trained on lower-resolution data and evaluated directly on higher-resolution data without any explicit supervision at the finer scale (see the evaluation sketch at the end of this summary).
2. **Superior Accuracy**: The FNO outperforms existing deep learning methods, achieving up to 30% lower error than previous learning-based solvers on the Burgers' equation, Darcy flow, and Navier-Stokes benchmarks.
3. **Efficiency**: The FNO is significantly faster than traditional PDE solvers, with an inference time of only 0.005 seconds on a 256×256 grid, compared to 2.2 seconds for a pseudo-spectral solver.
The FNO is constructed as an iterative architecture: the input function is lifted to a higher-dimensional channel representation, passed through a stack of layers that each combine an integral operator with a pointwise linear transform and a nonlinear activation, and finally projected back to the target dimension. The integral operators are parameterized directly in Fourier space and evaluated with the Fast Fourier Transform (FFT), giving quasi-linear complexity in the number of grid points. Because the learned parameters live on a truncated set of Fourier modes rather than on a fixed grid, the FNO is mesh-invariant and can handle different discretizations.
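The following is a minimal PyTorch sketch of this construction for a 2D problem. The layer width, number of retained Fourier modes, and number of layers are illustrative assumptions, not the paper's exact hyperparameters.

```python
# Minimal sketch of a 2D Fourier layer and FNO: lift -> (spectral conv + pointwise
# linear, nonlinearity) x L -> project. Hyperparameters are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpectralConv2d(nn.Module):
    """Integral operator parameterized in Fourier space: FFT, keep the lowest
    `modes` frequencies, multiply by learned complex weights, inverse FFT."""

    def __init__(self, in_ch, out_ch, modes1, modes2):
        super().__init__()
        scale = 1.0 / (in_ch * out_ch)
        self.modes1, self.modes2 = modes1, modes2
        self.w1 = nn.Parameter(scale * torch.randn(in_ch, out_ch, modes1, modes2, dtype=torch.cfloat))
        self.w2 = nn.Parameter(scale * torch.randn(in_ch, out_ch, modes1, modes2, dtype=torch.cfloat))

    def forward(self, x):                       # x: (batch, in_ch, H, W)
        B, _, H, W = x.shape
        x_ft = torch.fft.rfft2(x)               # (B, in_ch, H, W//2 + 1)
        out_ft = torch.zeros(B, self.w1.shape[1], H, W // 2 + 1,
                             dtype=torch.cfloat, device=x.device)
        # Keep only low-frequency modes (positive and negative along dim -2).
        out_ft[:, :, :self.modes1, :self.modes2] = torch.einsum(
            "bixy,ioxy->boxy", x_ft[:, :, :self.modes1, :self.modes2], self.w1)
        out_ft[:, :, -self.modes1:, :self.modes2] = torch.einsum(
            "bixy,ioxy->boxy", x_ft[:, :, -self.modes1:, :self.modes2], self.w2)
        return torch.fft.irfft2(out_ft, s=(H, W))


class FNO2d(nn.Module):
    """Lifting P, L Fourier layers with a pointwise linear path, projection Q."""

    def __init__(self, in_ch=3, out_ch=1, width=32, modes=12, n_layers=4):
        super().__init__()
        self.lift = nn.Conv2d(in_ch, width, 1)                 # pointwise lifting P
        self.spectral = nn.ModuleList(
            [SpectralConv2d(width, width, modes, modes) for _ in range(n_layers)])
        self.pointwise = nn.ModuleList(
            [nn.Conv2d(width, width, 1) for _ in range(n_layers)])
        self.proj = nn.Sequential(nn.Conv2d(width, 128, 1), nn.GELU(),
                                  nn.Conv2d(128, out_ch, 1))   # projection Q

    def forward(self, x):                       # x: (batch, in_ch, H, W)
        x = self.lift(x)
        for K, W in zip(self.spectral, self.pointwise):
            x = F.gelu(K(x) + W(x))
        return self.proj(x)
```

Because only the lowest `modes` Fourier coefficients carry learned weights, the parameter count is independent of the grid resolution, which is what makes the zero-shot super-resolution claimed in the first contribution possible.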
Experiments on various PDEs, including Burgers' equation, Darcy flow, and Navier-Stokes equations, demonstrate the FNO's effectiveness in both time-independent and time-dependent problems. The FNO also shows promise in Bayesian inverse problems, where it can efficiently approximate complex operators and provide accurate solutions with minimal computational cost.
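To make the zero-shot super-resolution point concrete, the snippet below is a hypothetical usage of the `FNO2d` sketch above. The input encoding (one coefficient channel plus two normalized coordinate channels) is an assumption for illustration, not necessarily the paper's exact setup; the same weights are simply applied on a coarser and a finer grid.

```python
# Illustrative zero-shot super-resolution with the FNO2d sketch defined above:
# the learned Fourier-mode weights do not depend on the grid size, so a model
# trained on 64x64 inputs can be evaluated directly on 256x256 inputs.
import torch

def make_input(a, H, W):
    """Stack a coefficient field with a normalized coordinate grid -> (1, 3, H, W)."""
    ys, xs = torch.meshgrid(torch.linspace(0, 1, H), torch.linspace(0, 1, W), indexing="ij")
    return torch.stack([a, xs, ys], dim=0).unsqueeze(0)

model = FNO2d(in_ch=3, out_ch=1, width=32, modes=12)

# Train on coarse 64x64 samples (training loop omitted) ...
coarse = make_input(torch.rand(64, 64), 64, 64)
u_coarse = model(coarse)          # (1, 1, 64, 64)

# ... then evaluate the same weights directly on a 256x256 grid.
fine = make_input(torch.rand(256, 256), 256, 256)
u_fine = model(fine)              # (1, 1, 256, 256) -- zero-shot super-resolution
```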