3 Jun 2024 | Pratik Rathore, Weimu Lei, Zachary Frangella, Lu Lu, Madeleine Udell
This paper explores the challenges in training Physics-Informed Neural Networks (PINNs), focusing on the role of the loss landscape. It examines the difficulties in minimizing the PINN loss function, particularly the ill-conditioning caused by the differential operators in the residual term. The paper compares the gradient-based optimizers Adam and L-BFGS, as well as their combination Adam+L-BFGS, and shows that Adam+L-BFGS is superior. It also introduces a novel second-order optimizer, NysNewton-CG (NNCG), which significantly improves PINN performance. On the theoretical side, the work elucidates the connection between ill-conditioned differential operators and ill-conditioning in the PINN loss, and demonstrates the benefits of combining first- and second-order optimization methods. Overall, the paper offers valuable insights and more powerful optimization strategies for training PINNs, which could improve their utility in solving complex partial differential equations.
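The core idea of combining a first-order optimizer with a second-order one can be illustrated on a toy problem. The sketch below is not the paper's NNCG or a PINN; it is a minimal numpy analogy, assuming an ill-conditioned quadratic loss as a stand-in for the ill-conditioned PINN loss. Adam makes early progress but oscillates near the minimum, while a Newton step (with the linear system solved by conjugate gradient, in the spirit of Newton-CG) converges immediately.

```python
import numpy as np

# Toy ill-conditioned quadratic f(x) = 0.5 * x^T A x, a stand-in for
# the ill-conditioned PINN loss (this is an illustrative analogy,
# not the paper's actual NNCG method or a PINN).
A = np.diag([1.0, 1e4])          # condition number 1e4
grad = lambda x: A @ x
f = lambda x: 0.5 * x @ A @ x

# --- Phase 1: Adam (first-order) ---
x = np.array([1.0, 1.0])
m, v = np.zeros(2), np.zeros(2)
beta1, beta2, lr, eps = 0.9, 0.999, 1e-2, 1e-8
for t in range(1, 501):
    g = grad(x)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g**2
    mhat, vhat = m / (1 - beta1**t), v / (1 - beta2**t)
    x -= lr * mhat / (np.sqrt(vhat) + eps)
loss_adam = f(x)                 # Adam stalls, oscillating near the minimum

# --- Phase 2: Newton step via conjugate gradient (second-order) ---
def cg(Amat, b, iters=50):
    """Solve Amat @ p = b with plain conjugate gradient."""
    p = np.zeros_like(b)
    r = b - Amat @ p
    d = r.copy()
    for _ in range(iters):
        Ad = Amat @ d
        alpha = (r @ r) / (d @ Ad)
        p += alpha * d
        r_new = r - alpha * Ad
        if np.linalg.norm(r_new) < 1e-12:
            break
        d = r_new + ((r_new @ r_new) / (r @ r)) * d
        r = r_new
    return p

# Full Newton step: solve (Hessian) A @ p = grad(x), then x <- x - p.
x -= cg(A, grad(x))
loss_newton = f(x)               # exact minimum for a quadratic
print(loss_adam, loss_newton)
```

On a quadratic the Newton step is exact, so the second-order phase lands on the minimizer regardless of the conditioning that hampered Adam; this mirrors the paper's motivation for following first-order training with a second-order method.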