This paper is a survey of some recent aspects and developments in stochastic control. We discuss the two main historical approaches, Bellman's optimality principle and Pontryagin's maximum principle, and their modern exposition via viscosity solutions and backward stochastic differential equations.
 Akian M. (1990) : “Analyse de l'algorithme multigrille FMGH de résolution d'équations d'Hamilton-Jacobi-Bellman”, Analysis and Optimization of Systems, Lect. Notes in Contr. and Inf. Sciences, 144, Springer-Verlag, pp. 113-122.
 Antonelli F. (1993) : “Backward-forward stochastic differential equations”, Annals of Appl. Prob., 3, 777-793.
 Artzner P., Delbaen F., Eber J.M. and Heath D. (1999) : “Coherent measures of risk”, Mathematical Finance, 9, 203-228.
 Bally V. and Pagès G. (2003) : “Error analysis of the optimal quantization algorithm for obstacle problems”, Stochastic Processes and their Applications, 106, 1-40.
 Baras J., Elliott R. and Kohlmann M. (1989) : “The partially observed stochastic minimum principle”, SIAM J. Cont. Optim., 27, 1279-1292.
 Barles G. and Jakobsen E. (2004) : “Error bounds for monotone approximation schemes for Hamilton-Jacobi-Bellman equations”, to appear in SIAM J. Num. Anal.
 Barles G. and Souganidis P. (1991) : “Convergence of approximation schemes for fully nonlinear second-order equations”, Asymptotic Analysis, 4, pp. 271-283.
 Bellman R. (1957) : Dynamic Programming, Princeton University Press.