Dynamic Programming is the analysis of multistage decision processes in the sequential mode. It is now widely recognized as a tool of great versatility and power, and is applied to an increasing extent in all phases of economic analysis, operations research, technology, and also in mathematical theory itself. In economics and operations research its impact may someday rival that of linear programming. The importance of this field is made apparent through a growing number of publications. Foremost among these is the pioneering work of Bellman. It was he who originated the basic ideas, formulated the principle of optimality, recognized its power, coined the terminology, and developed many of the present applications. Since then mathematicians, statisticians, operations researchers, and economists have come in, laying more rigorous foundations [KARLIN, BLACKWELL] and developing in depth such applications as the control of stochastic processes [HOWARD, JEWELL]. The field of inventory control has almost split off as an independent branch of Dynamic Programming on which a great deal of effort has been expended [ARROW, KARLIN, SCARF], [WHITIN], [WAGNER]. Dynamic Programming is also playing an increasing role in modern mathematical control theory [BELLMAN, Adaptive Control Processes (1961)]. Some of the most exciting work is going on in adaptive programming, which is closely related to sequential statistical analysis, particularly in its Bayesian form. In this monograph the reader is introduced to the basic ideas of Dynamic Programming.
- ISBN13 9783642864513
- Publish Date 18 April 2012 (first published 1 January 1968)
- Publish Status Active
- Publish Country DE
- Publisher Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
- Imprint Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
- Edition Softcover reprint of the original 1st ed. 1968
- Format Paperback
- Pages 144
- Language English