This book is divided into two parts. The first part covers deterministic optimal control theory, including calculus of variations, necessary conditions for an optimum, existence and regularity theorems for optimal controls, and dynamic programming. The second part introduces stochastic optimal control for Markov diffusion processes, using dynamic programming and the relationship between second-order partial differential equations and stochastic differential equations. The book includes examples, theorems, and applications in areas such as control theory, mathematical optimization, and Markov processes. It also discusses the stochastic linear regulator and the separation principle. The authors thank colleagues who reviewed the book and Bell Telephone Laboratories for their support.

The book is structured with chapters on calculus of variations, optimal control problems, existence and continuity properties of optimal controls, dynamic programming, stochastic differential equations, and optimal control of Markov diffusion processes. Appendices provide additional mathematical tools. The book is intended for graduate-level students and researchers in mathematics, economics, and engineering.