Perturbation methods provide powerful techniques for treating many problems of applied mathematics; such problems arise frequently in solid and fluid mechanics, in physics, in engineering, and in economics. The purpose of this book is to describe, analyse and, to some extent, generalise the principal results concerning perturbation methods in optimal control for systems governed by deterministic or stochastic differential equations, presenting a unified account of the available results. The first two chapters cover the main results of deterministic and stochastic optimal control theory and of ergodic control theory. The remaining chapters deal with applications of perturbation methods in deterministic and stochastic optimal control, treating regular and singular perturbations separately. Two broad categories of methods are used: the theory of necessary conditions, leading to Pontryagin's maximum principle, and the theory of sufficient conditions, leading to dynamic programming.
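To make the two categories of methods concrete, the following is an illustrative sketch (not drawn from the book itself) for a generic deterministic problem: minimising a cost $J(u)$ over controls $u$, subject to a state equation. The necessary-condition route introduces an adjoint variable and a Hamiltonian; the sufficient-condition route characterises the value function through the Hamilton–Jacobi–Bellman equation.

```latex
% Generic deterministic optimal control problem (illustrative):
%   minimise  J(u) = \int_0^T f(x(t),u(t))\,dt + h(x(T))
%   subject to \dot{x}(t) = g(x(t),u(t)), \quad x(0) = x_0.
\[
  H(x,p,u) = f(x,u) + p \cdot g(x,u)
\]
% Necessary conditions (Pontryagin's maximum principle): along an
% optimal pair (x^*,u^*) there exists an adjoint p satisfying
\[
  \dot{p}(t) = -\frac{\partial H}{\partial x}\bigl(x^*(t),p(t),u^*(t)\bigr),
  \qquad p(T) = h'\bigl(x^*(T)\bigr),
\]
% with u^*(t) minimising the Hamiltonian pointwise:
\[
  u^*(t) \in \arg\min_{u} H\bigl(x^*(t),p(t),u\bigr).
\]
% Sufficient conditions (dynamic programming): the value function
% V(t,x) solves the Hamilton--Jacobi--Bellman equation
\[
  -\frac{\partial V}{\partial t}
  = \min_{u}\Bigl[\, f(x,u) + \frac{\partial V}{\partial x} \cdot g(x,u) \Bigr],
  \qquad V(T,x) = h(x).
\]
```

In the perturbation setting studied in the later chapters, $f$ and $g$ additionally depend on a small parameter $\varepsilon$, either regularly or singularly (e.g. $\varepsilon$ multiplying part of the dynamics), and one studies the behaviour of the optimal cost and controls as $\varepsilon \to 0$.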