DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators

15 Apr 2020 | Lu Lu, Pengzhan Jin, and George Em Karniadakis
DeepONet is a neural network architecture designed to learn nonlinear operators from data, leveraging the universal approximation theorem of operators. The network consists of two sub-networks: a branch net that encodes the input function sampled at fixed sensor locations and a trunk net that encodes the locations at which the output function is evaluated. This structure allows DeepONet to efficiently and accurately approximate operators, such as those defined by dynamical systems and partial differential equations, even from relatively small datasets. The network's performance is evaluated through systematic simulations, which demonstrate significant reductions in generalization error compared to fully-connected networks. Theoretical analysis shows that the approximation error depends on the number of sensors and the type of input function, and the computational results validate these findings. DeepONet also exhibits high-order error convergence, including polynomial and exponential rates, indicating its effectiveness in learning complex operators. The paper concludes that DeepONet provides a powerful tool for learning nonlinear operators, with potential applications in a variety of scientific and engineering domains.
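
The branch-trunk construction can be summarized as follows: the branch net maps the input function u, sampled at m fixed sensor locations x_1, ..., x_m, to p coefficients b_1, ..., b_p, and the trunk net maps a query location y to p basis values t_1(y), ..., t_p(y); the operator value is then approximated as G(u)(y) ≈ sum_{k=1}^{p} b_k(u(x_1), ..., u(x_m)) t_k(y). The code below is a minimal sketch of this idea (an unstacked variant), not the authors' reference implementation; it assumes PyTorch, and the layer widths, m = 100 sensors, and p = 40 basis terms are illustrative choices rather than values prescribed by the paper.

    # Minimal DeepONet sketch (illustrative; assumes PyTorch).
    import torch
    import torch.nn as nn

    class DeepONet(nn.Module):
        def __init__(self, m=100, p=40, width=40):
            super().__init__()
            # Branch net: encodes the input function u sampled at m fixed sensors.
            self.branch = nn.Sequential(
                nn.Linear(m, width), nn.ReLU(),
                nn.Linear(width, p),
            )
            # Trunk net: encodes the location y where the output function is evaluated.
            self.trunk = nn.Sequential(
                nn.Linear(1, width), nn.ReLU(),
                nn.Linear(width, p), nn.ReLU(),
            )
            self.bias = nn.Parameter(torch.zeros(1))

        def forward(self, u_sensors, y):
            # u_sensors: (batch, m) sensor values u(x_1), ..., u(x_m)
            # y:         (batch, 1) query locations
            b = self.branch(u_sensors)   # (batch, p) coefficients b_k(u)
            t = self.trunk(y)            # (batch, p) basis values t_k(y)
            # G(u)(y) ≈ sum_k b_k * t_k + bias
            return (b * t).sum(dim=1, keepdim=True) + self.bias

    # Example usage with random data, purely to show the tensor shapes:
    model = DeepONet()
    u = torch.randn(8, 100)   # 8 input functions sampled at 100 sensors
    y = torch.rand(8, 1)      # one query location per sample
    pred = model(u, y)        # (8, 1) predicted values G(u)(y)

The dot product between branch and trunk outputs is what lets a single trained network evaluate the operator for a new input function at arbitrary output locations, which is the property the paper exploits for learning differential-equation operators.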