Minimum Variance Order when Estimating the Location of an Irregularity in the Density
Polfeldt, Thomas
Ann. Math. Statist., Vol. 41 (1970), no. 6, pp. 673-679 / Harvested from Project Euclid
Let $f(y)$ be a probability density on the real line, with \begin{equation*}\tag{1} f(y) = {}^{+}R(y)\quad (y > 0), \qquad f(y) = {}^{-}R(|y|)\quad (y < 0),\end{equation*} where ${}^{+}R$ and ${}^{-}R$ are normalized slowly varying functions as $y \rightarrow 0$ (cf. [6], Chapter 8, Section 8). Let $\theta$ be a location parameter. Denote by $X$ a sample $(x_1, \cdots, x_n)$ of $n$ independent observations from the distribution defined by $f(x - \theta)$, and by $t = t(X)$ an unbiased estimator of $\theta$. In this paper we study lower bounds for the variances $V_\theta(t)$, with special reference to their order in $n$. Among densities with regular variation at $\theta$, (1) covers all cases where $\theta$ is the location either of a cusp or of a discontinuity with finite and positive values of $f(0-)$ and $f(0+)$.

Under some conditions on ${}^{+}R$ and ${}^{-}R$, we calculate a function $\psi(h)$ such that \begin{equation*}\tag{2} V_\theta(t) \geqq K(\psi^{-1}(n^{-1}))^2 \quad (\text{all } t),\quad 0 < K < \infty.\end{equation*} It is surmised that this lower variance bound is of the best possible order in $n$. The bound is sometimes $o(n^{-1})$; hyperefficient estimators should then be possible. It is found that $\psi(h)$ depends heavily on the function ${}^{+}\varepsilon(s)$ defined by ${}^{+}R(y) = {}^{+}A \exp\{-\int_y^1 {}^{+}\varepsilon(s)/s\, ds\}$, and on the corresponding function ${}^{-}\varepsilon(s)$. In view of [6], Chapter 8, Section 9 (or (10) below), this form of ${}^{+}R(y)$ is not a constraint on $f(y)$.

The estimators $t_0$ constructed by Daniels [5] and Prakasa Rao [10] (for particular ${}^{+}R$ and ${}^{-}R$) have variances of the order of (2). This order is thus optimal for the densities considered: we have $V_\theta(t_0) \geqq \inf_t V_\theta(t) \geqq K(\psi^{-1}(n^{-1}))^2$. The Prakasa Rao estimators are hyperefficient. The calculations are based, in part, on ideas from the author's paper [9]. Since a cusp may be a mode, the results of this paper contribute to the discussion of the estimation of the mode (see [4], [11] and the references therein). The generalization of (1) to regularly varying $f(y)$ as $y \downarrow 0$ and $y \uparrow 0$ will be treated elsewhere ([8]). The generalization of (2) to biased estimators $t$ (or to mean square error) is straightforward, but some conditions on the bias function will be necessary.

Notation. $K$ and $K'$ denote positive, finite constants. If there exist $K, K'$ such that $K < a(x)/b(x) < K'$ for all $x$ with $|x| < x_0$, we write $a(x) = \Omega(b(x))\ (x \rightarrow 0)$; sometimes we omit $(x \rightarrow 0)$.
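As a worked illustration of the representation of ${}^{+}R$ above (an example assumed here for concreteness, not taken from the paper), consider a cusp density whose behaviour near $y = 0$ is ${}^{+}R(y) = {}^{-}R(y) = C e^{-y^{\lambda}}$ with constants $C > 0$ and $0 < \lambda < 1/2$, in the spirit of the examples treated in [5] and [10]. From $\log {}^{+}R(y) = \log C - y^{\lambda}$ we get $y\, d(\log {}^{+}R(y))/dy = -\lambda y^{\lambda}$, so the representation holds with ${}^{+}\varepsilon(s) = -\lambda s^{\lambda}$ and ${}^{+}A = {}^{+}R(1) = C e^{-1}$: \begin{equation*} {}^{+}A \exp\Bigl\{-\int_y^1 \frac{{}^{+}\varepsilon(s)}{s}\, ds\Bigr\} = C e^{-1} \exp\Bigl\{\lambda \int_y^1 s^{\lambda - 1}\, ds\Bigr\} = C e^{-1} e^{1 - y^{\lambda}} = C e^{-y^{\lambda}} = {}^{+}R(y).\end{equation*} Since ${}^{+}\varepsilon(s) = -\lambda s^{\lambda} \rightarrow 0$ as $s \downarrow 0$, the function ${}^{+}R$ is indeed normalized slowly varying, and $f(0+)$ is finite and positive, so this density falls under (1).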
Published: 1970-04-14
@article{1177697112,
     author = {Polfeldt, Thomas},
     title = {Minimum Variance Order when Estimating the Location of an Irregularity in the Density},
     journal = {Ann. Math. Statist.},
     volume = {41},
     number = {6},
     year = {1970},
     pages = {673-679},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177697112}
}
Polfeldt, Thomas. Minimum Variance Order when Estimating the Location of an Irregularity in the Density. Ann. Math. Statist., Vol. 41 (1970), no. 6, pp. 673-679. http://gdmltest.u-ga.fr/item/1177697112/