On Estimating the Slope of a Straight Line when Both Variables are Subject to Error
Spiegelman, Clifford
Ann. Statist., Tome 7 (1979) no. 1, p. 201-206 / Harvested from Project Euclid
Let $X_i$ and $Y_i$ be random variables related to other random variables $U_i, V_i$, and $W_i$ as follows: $X_i = U_i + W_i$, $Y_i = \alpha + \beta U_i + V_i$, $i = 1, \cdots, n$, where $\alpha$ and $\beta$ are finite constants. Here $X_i$ and $Y_i$ are observable while $U_i, V_i$, and $W_i$ are not. This model is customarily referred to as the regression problem with errors in both variables, and the central question is the estimation of $\beta$. We give a class of estimates for $\beta$ which, under weak assumptions, are asymptotically normal with mean $\beta$ and variance proportional to $1/n^{\frac{1}{2}}$. We then show how to choose a good estimate of $\beta$ from this class.
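The central difficulty the abstract alludes to can be seen in a small simulation: when the regressor is observed with error, ordinary least squares does not estimate $\beta$ but an attenuated slope $\beta\kappa$, where $\kappa = \sigma_U^2/(\sigma_U^2 + \sigma_W^2)$ is the reliability ratio. The sketch below illustrates this bias under an assumed Gaussian specification; it is not Spiegelman's estimator, and the variance parameters are illustrative choices.

```python
import random

random.seed(0)
n = 20000
alpha, beta = 1.0, 2.0
sigma_u, sigma_w, sigma_v = 1.0, 0.5, 0.3  # assumed, for illustration only

# Generate the errors-in-variables model: X = U + W, Y = alpha + beta*U + V.
U = [random.gauss(0.0, sigma_u) for _ in range(n)]
W = [random.gauss(0.0, sigma_w) for _ in range(n)]
V = [random.gauss(0.0, sigma_v) for _ in range(n)]
X = [u + w for u, w in zip(U, W)]
Y = [alpha + beta * u + v for u, v in zip(U, V)]

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

b_naive = ols_slope(X, Y)
kappa = sigma_u**2 / (sigma_u**2 + sigma_w**2)  # reliability ratio = 0.8 here

# The naive OLS slope converges to beta * kappa, not beta (attenuation bias).
print(b_naive, beta * kappa)
```

If $\sigma_W^2$ were known, dividing the naive slope by $\kappa$ would give the classical method-of-moments correction; the point of the paper is to estimate $\beta$ well without that kind of side information.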
Publié le : 1979-01-14
Classification: 62-02, 62J05. Keywords: Errors in variables, regression, asymptotic distribution
@article{1176344565,
     author = {Spiegelman, Clifford},
     title = {On Estimating the Slope of a Straight Line when Both Variables are Subject to Error},
     journal = {Ann. Statist.},
     volume = {7},
     number = {1},
     year = {1979},
     pages = {201-206},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176344565}
}
Spiegelman, Clifford. On Estimating the Slope of a Straight Line when Both Variables are Subject to Error. Ann. Statist., Tome 7 (1979) no. 1, pp. 201-206. http://gdmltest.u-ga.fr/item/1176344565/