Approximate Moments for the Serial Correlation Coefficient
White, John S.
Ann. Math. Statist., Volume 28 (1957) no. 4, pp. 798-802 / Harvested from Project Euclid
The first order Gaussian auto-regressive process $(x_t)$ may be defined by the stochastic difference equation \begin{equation*}\tag{1}x_t = \rho x_{t-1} + u_t,\end{equation*} where the $u$'s are NID(0, 1) and $\rho$ is an unknown parameter. The choice of a statistic as an estimator for $\rho$ depends on the initial conditions imposed on the difference equation (1). The so-called "circular" model is obtained by considering a sample of size $N$ and then assuming that $x_{N + 1} = x_1$. An appropriate estimator for $\rho$ in this case is the circular serial correlation coefficient \begin{equation*}\tag{2} r = \frac{\sum^N_{t = 1} x_tx_{t + 1}}{\sum^N_{t = 1} x^2_t}\quad (x_{N + 1} = x_1).\end{equation*} Leipnik [1] has derived an approximate density function \begin{equation*}\tag{3} f(t) = \frac{\Gamma\big(\frac{N + 2}{2}\big)}{\Gamma\big(\frac{N + 1}{2}\big) \Gamma\big(\frac{1}{2}\big)} (1 - 2t\rho + \rho^2)^{-N/2}(1 - t^2)^{(N - 1)/2}\end{equation*} for the estimator $r$. Leipnik also evaluated the first two moments of this distribution. In this paper a formula is obtained which gives $E(r^k)$ as a polynomial of degree $k$ in $\rho$.
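As an illustration only (not part of the original abstract), the following is a minimal Monte Carlo sketch of the circular estimator (2), assuming Python with NumPy; the helper names simulate_ar1 and circular_serial_correlation are hypothetical. It simulates the AR(1) process (1) and estimates the first two moments of $r$ by averaging over replications, the quantities the paper approximates analytically as polynomials in $\rho$.

```python
import numpy as np

def circular_serial_correlation(x):
    """Circular serial correlation coefficient of equation (2):
    r = sum_t x_t x_{t+1} / sum_t x_t^2, with the wrap-around
    convention x_{N+1} = x_1 (np.roll supplies the wrap)."""
    return np.dot(x, np.roll(x, -1)) / np.dot(x, x)

def simulate_ar1(N, rho, burn_in=500, rng=None):
    """Draw a length-N sample from x_t = rho * x_{t-1} + u_t, u_t ~ N(0, 1).
    A burn-in period is used so the sample is approximately stationary."""
    rng = np.random.default_rng() if rng is None else rng
    x = 0.0
    out = np.empty(N)
    for t in range(burn_in + N):
        x = rho * x + rng.standard_normal()
        if t >= burn_in:
            out[t - burn_in] = x
    return out

if __name__ == "__main__":
    rho, N, reps = 0.5, 50, 20_000   # illustrative values, not from the paper
    rng = np.random.default_rng(0)
    rs = np.array([circular_serial_correlation(simulate_ar1(N, rho, rng=rng))
                   for _ in range(reps)])
    # Monte Carlo estimates of E(r) and E(r^2).
    print("E(r)   ~", rs.mean())
    print("E(r^2) ~", np.mean(rs**2))
```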
Published: 1957-09-14
@article{1177706896,
     author = {White, John S.},
     title = {Approximate Moments for the Serial Correlation Coefficient},
     journal = {Ann. Math. Statist.},
     volume = {28},
     number = {4},
     year = {1957},
     pages = {798-802},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177706896}
}
White, John S. Approximate Moments for the Serial Correlation Coefficient. Ann. Math. Statist., Volume 28 (1957) no. 4, pp. 798-802. http://gdmltest.u-ga.fr/item/1177706896/