In generalized ridge estimation, the components of the ordinary least squares (OLS) regression coefficient vector that lie along the principal axes of the given regressor data are rescaled by known ridge factors. Generalizing a result of Swindel and Chapman, it is shown that, if each ridge factor is nonstochastic, nonnegative, and less than one, then there is at most one unknown direction in regression coefficient space along which the ridge coefficients have larger mean squared error than the OLS coefficients. Then, by decomposing the mean squared error of a ridge estimator into components parallel and orthogonal to the unknown true regression coefficient vector, new insight is gained into definitions of optimal ridge factors. Finally, estimators of certain unknown quantities are displayed that are maximum likelihood or unbiased under normal theory, or that have the correct range.
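The rescaling described in the first sentence can be sketched numerically. In the sketch below (an illustration only, not the paper's notation), the principal axes of the regressor matrix are obtained from its singular value decomposition, the OLS coefficient vector is expressed in that basis, and each component is multiplied by its ridge factor; setting every factor to one recovers OLS exactly.

```python
import numpy as np

def generalized_ridge(X, y, factors):
    """Generalized ridge estimate: shrink the components of the OLS
    coefficient vector along the principal axes of X by the given
    ridge factors (a length-p vector with entries in [0, 1])."""
    # SVD of the regressor matrix; rows of Vt are the principal axes.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Components of the OLS solution expressed in the principal-axis basis.
    ols_components = (U.T @ y) / s
    # Rescale each component by its ridge factor, then map back.
    return Vt.T @ (np.asarray(factors) * ols_components)
```

Because the principal axes form an orthonormal basis, multiplying every component by a factor below one strictly shrinks the length of the coefficient vector, which is the mechanism behind the mean-squared-error trade-off the abstract analyzes.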