In many statistical applications, nonparametric modeling can provide
insights into the features of a dataset that are not obtainable by other means.
One successful approach involves the use of (univariate or multivariate) spline
spaces. As a class, these methods have inherited much from classical tools for
parametric modeling. For example, stepwise variable selection with spline basis
terms is a simple scheme for locating knots (breakpoints) in regions where the
data exhibit strong, local features. Similarly, candidate knot configurations
(generated by this or some other search technique) are routinely evaluated
with traditional selection criteria like AIC or BIC. In short, strategies
typically applied in parametric model selection have proved useful in
constructing flexible, low-dimensional models for nonparametric problems.
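To make the stepwise scheme concrete, the following is a minimal sketch of forward knot selection for a univariate regression spline, scored by AIC. It is an illustration only, not the procedure used in this paper: it employs a truncated-power linear basis and Gaussian AIC, whereas the models studied here (Logspline, Triograms) use different bases and likelihoods. All function names are hypothetical.

```python
import numpy as np

def spline_basis(x, knots):
    """Truncated-power linear spline basis: [1, x, (x - k)_+ for each knot k]."""
    cols = [np.ones_like(x), x]
    cols += [np.maximum(x - k, 0.0) for k in knots]
    return np.column_stack(cols)

def aic(y, yhat, n_params):
    """Gaussian AIC up to an additive constant: n log(RSS/n) + 2p."""
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * n_params

def greedy_knot_selection(x, y, candidates, max_knots=5):
    """Forward stepwise search: repeatedly add the candidate knot that
    lowers AIC the most; stop when no candidate improves the criterion."""
    knots = []
    B = spline_basis(x, knots)
    beta, *_ = np.linalg.lstsq(B, y, rcond=None)
    best_aic = aic(y, B @ beta, B.shape[1])
    while len(knots) < max_knots:
        best = None
        for k in candidates:
            if k in knots:
                continue
            trial = sorted(knots + [k])
            B = spline_basis(x, trial)
            beta, *_ = np.linalg.lstsq(B, y, rcond=None)
            score = aic(y, B @ beta, B.shape[1])
            if score < best_aic:
                best_aic, best = score, k
        if best is None:
            break
        knots.append(best)
        knots.sort()
    return knots
```

On data with a sharp change in slope, the first knot added tends to land near the breakpoint, illustrating how the search concentrates knots where the data exhibit strong local features.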

Until recently, greedy, stepwise procedures were most frequently
suggested in the literature. Research into Bayesian variable selection,
however, has given rise to a number of new spline-based methods that primarily
rely on some form of Markov chain Monte Carlo to identify promising knot
locations. In this paper, we consider various alternatives to greedy,
deterministic schemes, and present a Bayesian framework for studying adaptation
in the context of an extended linear model (ELM). Our major test cases are
Logspline density estimation and (bivariate) Triogram regression models. We
selected these because they illustrate a number of computational and
methodological issues concerning model adaptation that arise in ELMs.