We study the problem of estimating θ from data Y ~ N(θ, σ²) under squared-error loss. We define three new scalar minimax problems in which the risk is weighted by the size of θ. Simple thresholding gives asymptotically minimax estimates in all three problems. We indicate the relationships of the new problems to each other and to two other neo-classical problems: the problems of the bounded normal mean and of the risk-constrained normal mean. Via the wavelet transform, these results have implications for adaptive function estimation in two settings: estimating functions of unknown type and degree of smoothness in a global ℓ₂ norm; and estimating a function of unknown degree of local Hölder smoothness at a fixed point. In the latter setting, the scalar minimax results imply: Lepskii's result that it is not possible fully to adapt to the unknown degree of smoothness without incurring a performance cost; and that simple thresholding of the empirical wavelet transform gives an estimate of a function at a fixed point which is, to within constants, optimally adaptive to the unknown degree of smoothness.
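As a hedged illustration of the simple thresholding referred to above (not code from the paper): hard thresholding keeps a noisy coordinate only when its magnitude exceeds a threshold λ; a sketch under the assumption of a sparse mean vector and the commonly used level λ = σ√(2 log n) follows. The signal, seed, and threshold constant are illustrative choices.

```python
import numpy as np

def hard_threshold(y, lam):
    """Hard thresholding: zero out coordinates with |y_i| <= lam, keep the rest."""
    return y * (np.abs(y) > lam)

# Illustrative setting: observations y_i ~ N(theta_i, sigma^2) with a
# sparse mean vector; threshold at sigma * sqrt(2 log n).
rng = np.random.default_rng(0)
n, sigma = 1024, 1.0
theta = np.zeros(n)
theta[:10] = 5.0                      # a few large coordinates, rest zero
y = theta + sigma * rng.standard_normal(n)
lam = sigma * np.sqrt(2 * np.log(n))
theta_hat = hard_threshold(y, lam)
```

In the function-estimation settings of the abstract, the same rule would be applied coordinatewise to empirical wavelet coefficients rather than to raw observations.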