Let $\nu$ denote the value function of a partially observed control problem. If $\nu$ is once differentiable in a certain direction $\hat{B}$, then optimal controls are characterized by a feedback involving the directional derivative $\hat{B}\nu$. It is also shown that $\nu$ satisfies the corresponding Bellman equation, an infinite-dimensional PDE on the space of measures, in the viscosity sense of Crandall and Lions.
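For orientation only, a schematic version of the dynamic programming equation described above can be sketched as follows; the notation ($\hat{B}^{u}$ for the generator of the filter process under a constant control $u$, $\ell$ for a running cost, $\lambda$ for a discount rate, $U$ for the control set) is illustrative and need not match the paper's. On the space of probability measures, a discounted Bellman equation of this type reads
$$ \lambda\,\nu(\mu) \;=\; \inf_{u \in U}\Big\{ \hat{B}^{u}\nu(\mu) + \ell(\mu,u) \Big\}, $$
and the feedback characterization selects, at each filter state $\mu$, a control attaining the infimum, $u^{*}(\mu) \in \arg\min_{u \in U}\big\{ \hat{B}^{u}\nu(\mu) + \ell(\mu,u) \big\}$. When $\nu$ is not differentiable, the equation is interpreted in the viscosity sense, with smooth test functions standing in for $\nu$ at points where they touch its graph.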
@article{1176990737,
author = {Hijab, Omar},
title = {Partially Observed Control of Markov Processes. III},
journal = {Ann. Probab.},
volume = {18},
number = {4},
year = {1990},
pages = {1099--1125},
language = {en},
url = {http://dml.mathdoc.fr/item/1176990737}
}
Hijab, Omar. Partially Observed Control of Markov Processes. III. Ann. Probab., Vol. 18 (1990), no. 4, pp. 1099-1125. http://gdmltest.u-ga.fr/item/1176990737/