The problem of estimating unknown parameters of Markov-additive processes from data observed up to a random stopping time is considered. An intermediate approach between the Bayesian and the minimax principles is applied, in which it is assumed that vague prior information on the distribution of the unknown parameters is available. The loss incurred in estimation is assumed to consist of the estimation error (defined by a weighted squared loss function) together with the cost of observing the process up to the stopping time. Several classes of optimal sequential procedures are obtained explicitly in the case when the available information on the prior distribution is restricted to a set Γ determined by certain moment-type conditions imposed on the prior distributions.
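For orientation, the loss structure described above can be sketched as follows (a general illustration of this type of problem, not the paper's exact formulation; the symbols $w$, $g$, $c$, $d$ and $\tau$ are assumed notation):
\[
L(\theta, d, \tau) \;=\; w(\theta)\,\bigl(d - g(\theta)\bigr)^2 \;+\; c(\tau),
\]
where $d$ is the terminal estimate of the parametric function $g(\theta)$, $w(\theta)\ge 0$ is a weight function, and $c(\tau)$ is the cost of observing the process up to the stopping time $\tau$. A sequential procedure $(\tau, d)$ is then called $\Gamma$-minimax if it minimizes the worst-case Bayes risk over the set $\Gamma$ of admissible priors,
\[
\sup_{\pi \in \Gamma} \int R\bigl(\theta, (\tau, d)\bigr)\,\pi(d\theta),
\]
with $\Gamma$ determined by moment-type conditions on $\pi$, as stated in the abstract.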
@article{bwmeta1.element.bwnjournal-article-doi-10_4064-am28-4-7,
  author   = {Ryszard Magiera},
  title    = {$\Gamma$-minimax sequential estimation for Markov-additive processes},
  journal  = {Applicationes Mathematicae},
  volume   = {28},
  year     = {2001},
  pages    = {467-485},
  zbl      = {1008.62660},
  language = {en},
  url      = {http://dml.mathdoc.fr/item/bwmeta1.element.bwnjournal-article-doi-10_4064-am28-4-7}
}
Ryszard Magiera. Γ-minimax sequential estimation for Markov-additive processes. Applicationes Mathematicae, Vol. 28 (2001), pp. 467-485. http://gdmltest.u-ga.fr/item/bwmeta1.element.bwnjournal-article-doi-10_4064-am28-4-7/