6 July 2006 | Jean-Michel Lasry, Pierre-Louis Lions
This paper continues the study of mean field games introduced in a previous note, focusing on Nash equilibria for stochastic control problems with a finite horizon. The authors present general existence and uniqueness results for the systems of partial differential equations derived from these problems, together with an interpretation of these systems in terms of optimal control. Theorems 2.1 and 2.2 establish the existence of regular solutions under suitable conditions, while Theorem 3.1 ensures uniqueness of solutions when the operators \( V \) and \( V_0 \) are strictly monotone. The paper also interprets these systems in the context of the optimal control of partial differential equations, and explores various extensions and variants of the system, including the case of multiple populations of players.
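For context, the finite-horizon system in question couples a backward Hamilton–Jacobi–Bellman equation for the value function \( u \) with a forward Kolmogorov (Fokker–Planck) equation for the distribution of players \( m \). The sketch below follows the standard formulation of this system rather than reproducing the note verbatim; here \( \nu > 0 \) is the diffusion coefficient, \( H \) the Hamiltonian, \( m_0 \) the initial distribution, and the precise hypotheses on \( H \), \( V \), and \( V_0 \) are those stated in the paper:

\[
\begin{cases}
-\,\partial_t u - \nu \Delta u + H(x, \nabla u) = V[m], \\[2pt]
\partial_t m - \nu \Delta m - \operatorname{div}\!\big( m \,\partial_p H(x, \nabla u) \big) = 0, \\[2pt]
u(x,T) = V_0[m(\cdot,T)](x), \qquad m(x,0) = m_0(x),
\end{cases}
\]

where the first equation runs backward from the terminal condition at time \( T \) and the second runs forward from \( m_0 \). The strict monotonicity invoked in Theorem 3.1 is understood in the integrated sense, e.g. for \( V \),

\[
\int \big( V[m_1] - V[m_2] \big)\, d(m_1 - m_2) \;\ge\; 0,
\]

with equality forcing \( V[m_1] = V[m_2] \); uniqueness then follows by testing each equation against the difference of two candidate solutions.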