Abstract
This paper is a survey of some recent aspects and developments in stochastic control. We discuss the two main historical approaches, Bellman's optimality principle and Pontryagin's maximum principle, and their modern treatment via viscosity solutions and backward stochastic differential equations. Some original proofs are presented in a unifying context that includes degenerate and singular control problems. We emphasize key results on the characterization of optimal controls for diffusion processes, with a view towards applications. Several examples from finance are worked out in detail with their explicit solutions. We also discuss numerical issues and open questions.