An optimum design of experiments for a class of estimates of the first derivative at 0 (used in stochastic approximation and density estimation) is shown to be equivalent to the problem of minimizing the function $\Gamma$ defined by $\Gamma(x) = \det\lbrack 1, x^3,\ldots, x^{2m-1} \rbrack / \det\lbrack x, x^3,\ldots, x^{2m-1} \rbrack$ over the set of all $m$-dimensional vectors $x$ whose components satisfy $0 < x_1 < -x_2 < \cdots < (-1)^{m-1} x_m$ and $\prod_{i=1}^{m} |x_i| = 1$. (In the determinants, $1$ denotes the column vector with all components equal to 1, and $x^i$ denotes the column vector whose components are those of $x$ raised to the $i$-th power.) The minimum of $\Gamma$ is shown to be $m$, and the point at which it is attained is characterized by Chebyshev polynomials of the second kind.
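The following is a minimal numerical sketch, not part of the source: it evaluates $\Gamma$ as the stated ratio of determinants and minimizes it over a parameterization of the constraint set (alternating signs, strictly increasing magnitudes, unit product), so the claimed minimum value $m$ can be checked numerically for small $m$. The helper names `gamma` and `feasible_point`, the parameterization, and the choice of optimizer are assumptions made for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

def gamma(x):
    """Gamma(x) = det[1, x^3, ..., x^(2m-1)] / det[x, x^3, ..., x^(2m-1)],
    where each power is taken componentwise and forms one column."""
    x = np.asarray(x, dtype=float)
    m = len(x)
    odd = [x ** (2 * k - 1) for k in range(2, m + 1)]   # columns x^3, x^5, ..., x^(2m-1)
    num = np.linalg.det(np.column_stack([np.ones(m)] + odd))
    den = np.linalg.det(np.column_stack([x] + odd))
    return num / den

def feasible_point(t):
    """Map an unconstrained vector t to a point of the constraint set:
    0 < x_1 < -x_2 < ... < (-1)^(m-1) x_m and prod |x_i| = 1."""
    r = np.cumsum(np.exp(t))                  # strictly increasing positive magnitudes
    r = r / np.prod(r) ** (1.0 / len(r))      # rescale so that prod |x_i| = 1
    signs = (-1.0) ** np.arange(len(r))       # alternating signs: x_1 > 0, x_2 < 0, ...
    return signs * r

m = 4
res = minimize(lambda t: gamma(feasible_point(t)), np.zeros(m), method="Nelder-Mead")
print(res.fun)   # should be numerically close to m if the stated minimum holds
```

For $m = 2$ the same computation can be done by hand: with $x_1 = a$, $x_2 = -1/a$, $0 < a < 1$, one gets $\Gamma = (a^3 + a^{-3})/(a^{-2} - a^2)$, which equals $2$ at $a^2 = (\sqrt{5} - 1)/2$, consistent with the stated minimum.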