August 23, 2017 | Maziar Raissi and George Em Karniadakis
The paper introduces a novel paradigm called "hidden physics models" for learning partial differential equations (PDEs) from small amounts of expensive-to-acquire data. The approach leverages Gaussian processes to balance model complexity against data fitting, enabling the extraction of patterns from high-dimensional data generated by experiments. The methodology applies across scientific domains, including fluid dynamics, quantum mechanics, and pattern formation, and its effectiveness is demonstrated on canonical problems such as the Navier-Stokes, Schrödinger, Kuramoto-Sivashinsky, and fractional equations. The hidden physics model explicitly encodes the underlying physical laws in its covariance functions, allowing the governing parameters to be learned efficiently from limited data. The results show that more data, less noise, and a smaller gap between snapshots all improve the performance of the algorithm.
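To make the "physics encoded in the covariance function" idea concrete, here is a minimal sketch for the 1D heat equation u_t = λ u_xx, under the assumption of a backward-Euler time discretization and a squared-exponential kernel. A GP prior on the later snapshot u^n induces a joint GP over both snapshots, because u^{n-1} = (I − Δt λ ∂²/∂x²) u^n is a linear transformation of u^n; maximizing the marginal likelihood then recovers λ along with the kernel hyperparameters. All function names, parameter values, and the toy data are illustrative assumptions, not the authors' original (MATLAB) implementation.

```python
# Hidden-physics-models sketch for u_t = lambda * u_xx with backward Euler:
#     u^{n-1} = (I - dt * lambda * d^2/dx^2) u^{n}.
# A squared-exponential GP prior on u^{n} induces the covariance blocks below;
# lambda is learned jointly with the kernel hyperparameters from two snapshots.
import numpy as np
from scipy.optimize import minimize

def se_kernel(x1, x2, sig2, ell):
    r = x1[:, None] - x2[None, :]
    return sig2 * np.exp(-0.5 * r**2 / ell**2), r

def cov_nn(x1, x2, sig2, ell, lam, dt):
    # Cov(u^n, u^n): the base kernel itself.
    k, _ = se_kernel(x1, x2, sig2, ell)
    return k

def cov_nm(x1, x2, sig2, ell, lam, dt):
    # Cov(u^n(x1), u^{n-1}(x2)) = (I - dt*lam*d^2/dx2^2) k(x1, x2).
    k, r = se_kernel(x1, x2, sig2, ell)
    d2 = (r**2 - ell**2) / ell**4                      # d^2 k / dx2^2 factor for the SE kernel
    return k * (1.0 - dt * lam * d2)

def cov_mm(x1, x2, sig2, ell, lam, dt):
    # Cov(u^{n-1}, u^{n-1}): operator applied to both arguments of the kernel.
    k, r = se_kernel(x1, x2, sig2, ell)
    d2 = (r**2 - ell**2) / ell**4
    d4 = (r**4 - 6.0 * r**2 * ell**2 + 3.0 * ell**4) / ell**8
    return k * (1.0 - 2.0 * dt * lam * d2 + (dt * lam)**2 * d4)

def neg_log_marginal_likelihood(theta, xn, un, xm, um, dt):
    sig2, ell, lam, noise = np.exp(theta)              # log-parametrization keeps values positive
    K12 = cov_nm(xn, xm, sig2, ell, lam, dt)
    K = np.block([[cov_nn(xn, xn, sig2, ell, lam, dt), K12],
                  [K12.T, cov_mm(xm, xm, sig2, ell, lam, dt)]])
    K += (noise + 1e-8) * np.eye(K.shape[0])
    y = np.concatenate([un, um])
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L)))

# Two noisy snapshots of an exact heat-equation solution, dt apart (toy data).
rng = np.random.default_rng(0)
lam_true, dt, t0, noise_sd = 1.0, 0.01, 0.1, 1e-3
u_exact = lambda x, t: np.exp(-lam_true * np.pi**2 * t) * np.sin(np.pi * x)
xn, xm = rng.uniform(0, 1, 50), rng.uniform(0, 1, 50)
un = u_exact(xn, t0 + dt) + noise_sd * rng.normal(size=xn.size)   # later snapshot u^n
um = u_exact(xm, t0) + noise_sd * rng.normal(size=xm.size)        # earlier snapshot u^{n-1}

# Learn (signal variance, length scale, lambda, noise variance) jointly.
theta0 = np.log([1.0, 0.3, 0.5, 1e-4])
res = minimize(neg_log_marginal_likelihood, theta0, args=(xn, un, xm, um, dt),
               method="Nelder-Mead", options={"maxiter": 4000, "fatol": 1e-10})
print("estimated lambda:", np.exp(res.x)[2], " true lambda:", lam_true)
```

The sketch also illustrates the summary's closing observation: the backward-Euler step introduces an O(Δt) bias in the recovered parameter, so a smaller gap between snapshots (and lower noise) should yield a more accurate estimate. For nonlinear equations, the paper handles the operator by linearizing nonlinear terms about the previous snapshot, so the same covariance-block construction carries over.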