The finite-sample risk of the $k$ nearest neighbor classifier that
uses a weighted $L^p$-metric as a measure of class similarity is examined. For
a family of classification problems with smooth distributions in $\mathbb{R}^n$,
an asymptotic expansion for the risk is obtained in decreasing fractional
powers of the reference sample size. An analysis of the leading expansion
coefficients reveals that the optimal weighted $L^p$-metric, that is, the
metric that minimizes the finite-sample risk, tends to a weighted Euclidean
(i.e., $L^2$) metric as the sample size increases. Numerical simulations
corroborate this finding for a pattern recognition problem with normal
class-conditional densities.
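For reference, a weighted $L^p$-metric of the kind considered here presumably takes the standard form (the weight symbols $w_i$ are illustrative notation, not drawn from the paper):
\[
  d_{w,p}(x,y) \;=\; \Bigl( \sum_{i=1}^{n} w_i \,\lvert x_i - y_i \rvert^{p} \Bigr)^{1/p},
  \qquad w_i > 0, \quad p \ge 1,
\]
so that the weighted Euclidean metric toward which the optimal metric tends corresponds to the case $p = 2$.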