
Pontryagin's maximum principle (MP), involving the Hamiltonian system, and Bellman's dynamic programming (DP), involving the HJB equation, are the two most important approaches in modern optimal control theory. However, these two approaches have been developed separately in the literature, and it has been a long-standing yet fundamentally important problem to disclose the relationship between them and to establish a unified theory. The problem is by no means a "new" one; indeed, it is rooted in the Hamilton-Jacobi theory of analytic mechanics and the method of characteristics in classical PDE theory, and it has intrinsic relations with the Feynman-Kac formula in stochastic analysis and shadow price theory in economics. This paper discusses some deep connections between the MP and DP in stochastic optimal control from various aspects.
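As a classical instance of this connection (a standard textbook illustration, not a result quoted from the paper itself): in the deterministic case, with dynamics $\dot{x}=b(t,x,u)$, running cost $f$, terminal cost $h$, and Hamiltonian $H(t,x,u,p)=p\cdot b(t,x,u)+f(t,x,u)$, the MP adjoint variable coincides with the gradient of the DP value function along the optimal pair $(\bar{x},\bar{u})$, provided the value function $V$ is continuously differentiable there.

```latex
\[
  \begin{aligned}
    &\text{HJB (DP):} && V_t(t,x) + \inf_{u}\bigl\{ V_x(t,x)\cdot b(t,x,u) + f(t,x,u) \bigr\} = 0,
       \quad V(T,x) = h(x),\\[2pt]
    &\text{Adjoint (MP):} && \dot{p}(t) = -H_x\bigl(t,\bar{x}(t),\bar{u}(t),p(t)\bigr),
       \quad p(T) = h_x\bigl(\bar{x}(T)\bigr),\\[2pt]
    &\text{Bridge:} && p(t) = V_x\bigl(t,\bar{x}(t)\bigr), \qquad t \in [0,T].
  \end{aligned}
\]
```

The bridge identity is exactly the shadow-price interpretation mentioned above: $p(t)$ measures the marginal change of the optimal cost with respect to the state. In the stochastic setting the smoothness assumption on $V$ generally fails, which is one reason the MP-DP relationship becomes a genuinely delicate problem.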
