This paper addresses model selection in regression. We emphasize the case of two models, testing which of them provides better prediction based on $n$ observations. Within a family of selection rules based on maximizing a penalized log-likelihood under a normal model, we search for
asymptotically minimax rules over a class $\mathscr{G}$ of possible joint
distributions of the explanatory and response variables. For the class
$\mathscr{G}$ of multivariate normal joint distributions, it is shown that asymptotically minimax selection rules are close to the AIC selection rule when the difference between the models’ dimensions is large. It is further proved that
under fairly mild assumptions on $\mathscr{G}$, any asymptotically minimax sequence of procedures must have dimension penalties whose difference remains bounded as the number of observations tends to infinity. The results are then extended to the case of more than two competing
models.
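
To fix ideas, a generic rule in such a penalized-likelihood family (the notation here is illustrative rather than the paper's own) selects the model $k \in \{1, 2\}$ that maximizes
\[
\hat{\ell}_k - c_{n,k},
\]
where $\hat{\ell}_k$ is the maximized log-likelihood of model $k$ under the normal model and $c_{n,k}$ is its dimension penalty; the AIC rule corresponds to taking $c_{n,k} = d_k$, the number of free parameters of model $k$.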