Recently, Rissanen proposed a new model selection criterion, PLS (predictive least squares), which selects the model that minimizes the accumulated squared prediction errors. The information-based criteria, such as AIC and BIC, typically select the model that minimizes a loss function that can be expressed as a sum of two terms: one measures the goodness of fit and the other penalizes the complexity of the selected model. In this paper we provide such an interpretation for PLS. Using this relationship, we give sufficient conditions for PLS to be strongly consistent in stochastic regression models. The asymptotic equivalence between PLS and BIC for ergodic models is then studied. Finally, based on the Fisher information, a new criterion, FIC, is proposed. This criterion shares most asymptotic properties with PLS while removing some of the difficulties encountered by PLS in finite-sample situations.
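As a rough illustrative sketch (the notation $y_t$, $\mathbf{x}_t(k)$, $\hat{\boldsymbol{\beta}}_{t-1}(k)$, $m$, $c_n$, and $\dim(k)$ is introduced here for exposition and is not fixed by the abstract), the contrast between the two families of criteria can be written as follows: PLS accumulates honest one-step-ahead prediction errors, whereas an information criterion adds an explicit complexity penalty to a goodness-of-fit term,
\[
\mathrm{PLS}_n(k) \;=\; \sum_{t=m+1}^{n} \bigl( y_t - \mathbf{x}_t(k)^{\top} \hat{\boldsymbol{\beta}}_{t-1}(k) \bigr)^2,
\qquad
\mathrm{IC}_n(k) \;=\; \sum_{t=1}^{n} \bigl( y_t - \mathbf{x}_t(k)^{\top} \hat{\boldsymbol{\beta}}_{n}(k) \bigr)^2 \;+\; c_n \,\hat{\sigma}^2 \dim(k),
\]
where $\hat{\boldsymbol{\beta}}_{t-1}(k)$ is the least-squares estimate for candidate model $k$ based on the first $t-1$ observations, $m$ is the first index at which that estimate is well defined, and $c_n = 2$ or $c_n = \log n$ correspond to AIC- and BIC-type penalties, respectively. The interpretation studied in the paper amounts to decomposing $\mathrm{PLS}_n(k)$ into a residual-sum-of-squares term plus an implicit, data-driven penalty of the second kind.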