Additive principal components are a nonlinear generalization of linear principal components: the linear forms $\sum_i a_i X_i$ are replaced with additive functions $\sum_i \phi_i(X_i)$. Replacing linear methods with additive ones gains considerable flexibility for fitting data. Our interest is in the smallest principal components, which is somewhat uncommon. Smallest additive principal components amount to data descriptions in terms of approximate implicit equations, $\sum_i \phi_i(X_i) \approx 0$. Estimating such equations is a data-analytic method in its own right, competing in some cases with the more customary regression approaches. It is also a diagnostic tool in additive regression for detecting so-called "concurvity," a term describing degeneracies among predictor variables in additive regression models, analogous to collinearity in linear regression models. Concurvity may lead to statistically unstable contributions of variables to additive models. As an example, we show in a reanalysis of the ozone data of Breiman and Friedman that concurvity does indeed exist in these data, a fact that may affect the interpretation of the additive fits. In the second half of the paper we give some second-order theory, including a description of null situations and eigenexpansions derived from associated eigenproblems. We show how ACE and additive principal components are related, and we outline analytical methods for theoretical calculation of additive principal components. Finally, we consider methods of estimation and computation. Additive principal components have a long tradition in psychometric research and correspondence analysis; only in recent years have they begun to receive attention from statisticians.
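The idea of a smallest additive principal component, $\sum_i \phi_i(X_i) \approx 0$, can be illustrated with a minimal numerical sketch. This is not the estimation procedure of the paper; it is an assumed finite-basis approximation in which each transform $\phi_j$ is expanded in a centered polynomial basis (a stand-in for splines), reducing the variational problem, minimize $\mathrm{Var}(\sum_j \phi_j(X_j))$ subject to $\sum_j \mathrm{Var}(\phi_j(X_j)) = 1$, to a generalized eigenproblem. A near-zero smallest eigenvalue signals an approximate implicit additive equation among the variables, i.e. concurvity:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 500

# Simulated data with a near-exact additive implicit equation:
# x3 = -(x1^2 + x2) + small noise, so phi1(x1) + phi2(x2) + phi3(x3) ~ 0
# with phi1(t) = t^2, phi2(t) = t, phi3(t) = t.
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = -(x1**2 + x2) + 0.05 * rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

def basis(x, degree=3):
    """Centered polynomial basis for one variable (stand-in for a spline basis)."""
    B = np.column_stack([x**k for k in range(1, degree + 1)])
    return B - B.mean(axis=0)

bases = [basis(X[:, j]) for j in range(X.shape[1])]
B = np.hstack(bases)          # stacked design matrix, n x (p * degree)
C = B.T @ B / n               # covariance of all basis functions jointly

# Block-diagonal matrix of within-variable covariances, encoding the
# normalization sum_j Var(phi_j) = 1.
D = np.zeros_like(C)
start = 0
for Bj in bases:
    k = Bj.shape[1]
    D[start:start + k, start:start + k] = Bj.T @ Bj / n
    start += k

# Smallest generalized eigenvalue of (C, D) = minimized Var(sum_j phi_j)
# under the normalization constraint; near zero indicates concurvity.
evals, evecs = eigh(C, D)
print(f"smallest additive eigenvalue: {evals[0]:.6f}")
```

The coefficient vector `evecs[:, 0]`, split back into per-variable blocks, gives the estimated transforms $\hat\phi_j$ in the chosen basis; in this simulated example they recover (up to scale) the quadratic and linear shapes built into the data.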