The problem of predicting a future measurement on an individual from past measurements is discussed under nonparametric and parametric growth models. The efficiencies of different methods of prediction are assessed by the cross-validation, or leave-one-out, technique in each of three data sets, and the results are compared. Under nonparametric models, direct and inverse regression methods of prediction are described and their relative advantages and disadvantages are discussed. Under parametric models, polynomial and factor-analytic growth curves are considered. Bayesian and empirical Bayesian methods are used to deal with unknown parameters. A general finding is that much of the information for forecasting is contained in the immediate past few observations or in a few summary statistics based on past data. A number of data reduction methods are suggested and analyses based on them are described. The usefulness of the leave-one-out technique in model selection is demonstrated. A new method of calibration is introduced to improve prediction.
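The leave-one-out assessment described above can be sketched as follows. This is a minimal illustration, not the paper's actual analysis: the data are simulated growth curves (a hypothetical stand-in for the three data sets), and a simple least-squares direct regression predicts the final measurement from past ones. Comparing the full past against only the most recent observation mirrors the abstract's finding that most forecasting information lies in the immediate past few observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical longitudinal data: n individuals measured at t time points,
# generated as cumulative sums so each row is a monotone "growth" curve.
n, t = 30, 4
growth = np.cumsum(rng.normal(1.0, 0.3, size=(n, t)), axis=1)

X = np.column_stack([np.ones(n), growth[:, :-1]])  # intercept + past measurements
y = growth[:, -1]                                  # future measurement to predict

def loo_mse(X, y):
    """Leave-one-out mean squared prediction error for least squares."""
    errs = []
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        # Fit on all individuals except i, then predict individual i.
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        errs.append((y[i] - X[i] @ beta) ** 2)
    return float(np.mean(errs))

mse_full = loo_mse(X, y)              # predictor uses the entire past record
mse_last = loo_mse(X[:, [0, -1]], y)  # predictor uses only the most recent value
```

In this toy setting, `mse_last` typically comes close to `mse_full`, consistent with the abstract's observation that a few recent observations or summary statistics carry most of the predictive information.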
Published: 1987-11-14
Classification:
Bayesian approach,
calibration,
cross-validation,
empirical Bayes,
factor analytic model,
inverse regression,
leave-one-out method,
mixed model,
part correlation,
polynomial model,
predictive density,
principal component regression
@article{1177013119,
author = {Rao, C. Radhakrishna},
title = {Prediction of Future Observations in Growth Curve Models},
journal = {Statist. Sci.},
volume = {2},
number = {4},
year = {1987},
pages = {434--447},
language = {en},
url = {http://dml.mathdoc.fr/item/1177013119}
}
Rao, C. Radhakrishna. Prediction of Future Observations in Growth Curve Models. Statist. Sci., Volume 2 (1987) no. 4, pp. 434-447. http://gdmltest.u-ga.fr/item/1177013119/