This paper is concerned with optimal estimation of the additive components of a nonparametric additive regression model. Several different smoothing methods are considered, including kernels, local polynomials, smoothing splines, and orthogonal series. It is shown that, asymptotically and to first order, each additive component can be estimated as well as it could be if the other components were known. This result is used to show that in additive models the asymptotically optimal minimax rates and constants are the same as in nonparametric regression models with a single component.
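For context, the model referred to above can be written in standard notation (the symbols below are introduced here only for illustration and do not appear in the abstract) as
\[
  Y \;=\; \mu \;+\; \sum_{j=1}^{d} f_j(X_j) \;+\; \varepsilon,
  \qquad \mathbb{E}\bigl[\varepsilon \mid X_1,\dots,X_d\bigr] = 0,
\]
where the oracle-type claim is that each component $f_j$ can be estimated, to first order, with the same minimax rate and constant as in the one-component model $Y = \mu + f_j(X_j) + \varepsilon$ in which the remaining components $f_k$, $k \neq j$, are treated as known.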