Motivated by recently developed threshold rules for wavelet
estimators, we suggest threshold methods for general kernel density estimators,
including those of classical Rosenblatt–Parzen type. Thresholding makes
kernel methods competitive by adapting them to a wide variety of
aberrations in complex signals. It is argued that term-by-term thresholding
does not always produce optimal performance, since individual coefficients
cannot be estimated sufficiently accurately for reliable decisions to be made.
Therefore, we suggest grouping coefficients into blocks and making simultaneous
threshold decisions about all coefficients within a given block. It is argued
that block thresholding has a number of advantages, including that it produces
adaptive estimators which achieve minimax-optimal convergence rates without the
logarithmic penalty that is sometimes associated with term-by-term
thresholding. More than this, the convergence rates are achieved over large
classes of functions with discontinuities, indeed with a number of
discontinuities that diverges polynomially fast with sample size. These results
are also established for block thresholded wavelet estimators, which, although
they can be interpreted within the kernel framework, are often most
conveniently constructed in a slightly different way.
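
To make the block rule concrete, the following Python sketch applies a keep-or-kill decision simultaneously to each block of empirical coefficients (for instance, wavelet detail coefficients at one resolution level). The block length, the threshold constant, and the helper name block_threshold are illustrative assumptions for exposition only, not the precise prescription analysed in the paper.

# A minimal sketch of keep-or-kill block thresholding of empirical
# coefficients. The block length and threshold are illustrative choices.
import numpy as np

def block_threshold(coeffs, block_len, lam):
    # coeffs    : 1-D array of empirical coefficients at one resolution level
    # block_len : number of coefficients grouped into each block
    # lam       : threshold compared with each block's mean squared coefficient
    out = coeffs.copy()
    n = len(coeffs)
    for start in range(0, n, block_len):
        block = coeffs[start:start + block_len]
        # Simultaneous decision for the whole block: keep the block only if
        # its average squared coefficient exceeds the threshold.
        if np.mean(block ** 2) <= lam:
            out[start:start + block_len] = 0.0
    return out

# Toy usage: a sparse coefficient sequence observed with unit-variance noise.
rng = np.random.default_rng(0)
true = np.zeros(64)
true[20:24] = 3.0                      # a short burst of large coefficients
noisy = true + rng.normal(0.0, 1.0, size=64)
# A term-by-term rule would compare each |coefficient| with roughly
# sigma * sqrt(2 log n); here the whole block is retained or discarded at once.
denoised = block_threshold(noisy, block_len=4, lam=2.0)
print(np.nonzero(denoised)[0])

Because the decision pools block_len coefficients, the block's mean square is estimated more stably than any individual coefficient, which is the mechanism behind avoiding the logarithmic penalty mentioned above.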