We establish upper and lower bounds on the asymptotic minimax risk in estimating (1) a density at a point, when the density is known to be decreasing with a Lipschitz condition; (2) a density at a point, when the density satisfies a local second-order smoothness (Sacks–Ylvisaker) condition; and (3) the $k$th derivative of the density at a point, when the density satisfies a local $L_p$ constraint on the $m$th derivative. In (1), (2) and (3) the upper and lower bounds differ asymptotically by less than 18%, 24.3% and 25%, respectively. Our bounds on the asymptotic minimax risk come from a simple formula. Let $\omega(\varepsilon)$ denote the modulus of continuity, with respect to Hellinger distance, of the functional to be estimated; in the cases above this has the form $\omega(\varepsilon) = A\varepsilon^r(1 + o(1))$ for certain constants $A$ and $r$. Then, in all these cases, the minimax risk is asymptotically not larger than $r^r(1 - r)^{1 - r}\omega^2(n^{-1/2})/4$ and is at most a few percent smaller. The modulus of continuity of the functional, and hence the geometry of the problem, determines the difficulty of estimation. At a technical level, two interesting aspects of our work are (1) the derivation of minimax affine estimates of a linear functional in the white noise model with a general convex, asymmetric a priori class, and (2) the use of Le Cam's theory of convergence of experiments to show that the density model is asymptotically just as hard as the white noise model. At a conceptual level, an interesting aspect of our work is the use of the hardest one-dimensional subproblem heuristic. Our method works because, in these cases, the difficulty of the hardest one-dimensional subproblem is essentially equal to the difficulty of the full infinite-dimensional problem.
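As an illustrative sketch only (the exponent $r = 2/3$ is assumed here for concreteness, being the classical pointwise rate under a Lipschitz-type condition; the specific value is not asserted above), the bound can be evaluated directly from the modulus $\omega(\varepsilon) = A\varepsilon^r(1 + o(1))$:
$$
r^r(1 - r)^{1 - r}\,\frac{\omega^2(n^{-1/2})}{4}
= \left(\tfrac{2}{3}\right)^{2/3}\left(\tfrac{1}{3}\right)^{1/3}\,\frac{A^2 n^{-2/3}}{4}\,(1 + o(1))
\approx 0.13\,A^2 n^{-2/3},
$$
so the geometric quantity $A$, together with the exponent $r$, fixes the asymptotic level of difficulty up to the stated few-percent gap.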