Stochastic Controls: Hamiltonian Systems and HJB Equations


Jiongmin Yong, Xun Yu Zhou
This book provides a comprehensive overview of stochastic controls, Hamiltonian systems, and HJB equations. It is structured into seven chapters, each covering a different aspect of stochastic control theory.

The first chapter introduces basic stochastic calculus, including probability theory, stochastic processes, stopping times, martingales, Itô's integral, and stochastic differential equations (SDEs).

The second chapter discusses stochastic optimal control problems, covering deterministic cases, examples of stochastic control problems, formulations of optimal control problems, the existence of optimal controls, reachable sets, and other stochastic control models.

The third chapter focuses on the maximum principle and stochastic Hamiltonian systems, detailing the stochastic maximum principle, its proof, sufficient conditions for optimality, and problems with state constraints.

The fourth chapter explores dynamic programming and HJB equations, discussing the stochastic principle of optimality, properties of the value function, viscosity solutions, and the uniqueness of viscosity solutions.
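For readers wanting the flavor of this material, the prototypical problem runs as follows (the notation here is generic and illustrative, not quoted from the book): the state follows a controlled SDE, each admissible control is assigned a cost, and the value function formally solves the HJB equation.

\[
\begin{aligned}
&dX(s) = b(s, X(s), u(s))\,ds + \sigma(s, X(s), u(s))\,dW(s), \qquad X(t) = x,\\
&J(t, x; u(\cdot)) = \mathbb{E}\Big[\int_t^T f(s, X(s), u(s))\,ds + h(X(T))\Big], \qquad V(t, x) = \inf_{u(\cdot)} J(t, x; u(\cdot)),\\
&V_t + \inf_{u \in U}\Big\{\tfrac{1}{2}\operatorname{tr}\big(\sigma\sigma^{\top} V_{xx}\big) + b^{\top} V_x + f\Big\} = 0, \qquad V(T, x) = h(x).
\end{aligned}
\]

The last line is the HJB equation of the fourth chapter; since V need not be smooth in general, the book works with viscosity solutions.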
The fifth chapter examines the relationship between the maximum principle and dynamic programming, covering classical Hamilton-Jacobi theory, the relationship for deterministic and stochastic systems, and stochastic verification theorems.

The sixth chapter addresses linear quadratic (LQ) optimal control problems, including deterministic LQ problems, stochastic LQ problems, stochastic Riccati equations, and mean-variance portfolio selection.

The seventh chapter discusses backward stochastic differential equations (BSDEs), including linear and nonlinear BSDEs, Feynman-Kac-type formulae, forward-backward stochastic differential equations (FBSDEs), and option pricing problems. The book also includes a bibliography and an index.
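As a pointer to the final chapter's object of study: a BSDE prescribes terminal rather than initial data. In generic notation (again illustrative, not taken verbatim from the book), one seeks an adapted pair (Y, Z) satisfying

\[
Y(t) = \xi + \int_t^T g(s, Y(s), Z(s))\,ds - \int_t^T Z(s)\,dW(s), \qquad 0 \le t \le T,
\]

where \(\xi\) is the prescribed terminal value Y(T). When \(\xi = h(X(T))\) for a forward diffusion X, the solution connects to a parabolic PDE through Feynman-Kac-type formulae, which is the bridge to the option pricing problems mentioned above.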