For the problem of estimating a regression function, $\mu$ say,
subject to shape constraints, such as monotonicity or convexity, it is argued that
the divergence of the maximum likelihood estimator provides a useful measure of
the effective dimension of the model. Inequalities are derived for the expected
mean squared error of the maximum likelihood estimator and the expected
residual sum of squares. These generalize equalities from the case of linear
regression. As an application, it is shown that the maximum likelihood
estimator of the error variance $\sigma^2$ is asymptotically normal with mean
$\sigma^2$ and variance $2\sigma^4/n$. For monotone regression, it is shown
that the maximum likelihood estimator of $\mu$ attains the optimal rate of
convergence, and a bias correction to the maximum likelihood estimator of
$\sigma^2$ is derived.
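
The divergence in question is not defined in the abstract; presumably it is the degrees-of-freedom quantity
\[
D(y) \;=\; \sum_{i=1}^{n} \frac{\partial \hat{\mu}_i(y)}{\partial y_i},
\]
where $\hat{\mu}(y)$ denotes the vector of fitted values at data $y$. For least squares regression onto a $d$-dimensional linear subspace, $D(y) = \operatorname{tr}(H) = d$, the trace of the hat matrix $H$, which is what motivates reading the divergence as an effective dimension.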
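
The linear-regression equalities being generalized are presumably the classical identities for projection onto a $d$-dimensional subspace containing $\mu$, with independent errors of variance $\sigma^2$:
\[
E\,\lVert \hat{\mu} - \mu \rVert^2 \;=\; d\,\sigma^2,
\qquad
E(\mathrm{RSS}) \;=\; (n - d)\,\sigma^2 .
\]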
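
In display form, and writing $\hat{\sigma}^2 = \mathrm{RSS}/n$ for the maximum likelihood estimator under Gaussian errors (an assumption about the normalization intended), the asymptotic normality claim reads
\[
\sqrt{n}\,\bigl(\hat{\sigma}^2 - \sigma^2\bigr) \;\Rightarrow\; N\bigl(0,\, 2\sigma^4\bigr),
\]
the same limiting variance as in the parametric Gaussian model.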
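
For monotone regression, the optimal rate alluded to is presumably the familiar $n^{-1/3}$ rate of isotonic regression, for instance
\[
\hat{\mu}(x_0) - \mu(x_0) \;=\; O_p\bigl(n^{-1/3}\bigr)
\]
at a fixed point $x_0$ where $\mu$ is strictly increasing and differentiable; the abstract does not specify the loss or function class, so this reading is an assumption.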