It is shown that the mean of a normal distribution with unknown variance \sigma^2 can be estimated to lie within an interval of given fixed width 2d at a prescribed confidence level by a procedure whose cost of ignorance about \sigma^2 is bounded. That is, the expected sample size exceeds the fixed sample size one would use if \sigma^2 were known by a finite amount, the difference depending on the confidence level \alpha but not on the values of the mean \mu, the variance \sigma^2, or the interval width 2d. A number of unpublished results on the moments of the sample size are presented, some of which do not depend on the assumption of normality.
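The abstract does not spell out the procedure, but a classical two-stage rule of this kind (in the spirit of Stein's method) takes a pilot sample, estimates \sigma^2 by the sample variance, and then sets the total sample size from the estimated variance, the half-width d, and a Student-t critical value. The sketch below illustrates that sample-size rule only; the function name and the convention that the caller supplies the t quantile for n_0 - 1 degrees of freedom are assumptions, not details from the paper.

```python
import math
import statistics

def two_stage_sample_size(pilot, d, t_quantile):
    """Total sample size N for a fixed-width (2d) confidence interval
    for a normal mean with unknown variance, via a Stein-type rule.

    pilot      -- initial observations (length n0 >= 2)
    d          -- half-width of the desired confidence interval
    t_quantile -- upper critical value of Student's t with n0 - 1
                  degrees of freedom at the prescribed confidence level
                  (supplied by the caller, e.g. from a t table)
    """
    n0 = len(pilot)
    s2 = statistics.variance(pilot)  # unbiased sample variance from the pilot
    # Take at least the pilot; otherwise enough observations so that the
    # t-interval based on the pilot variance has half-width at most d.
    return max(n0, math.ceil(s2 * t_quantile**2 / d**2))
```

For example, with pilot observations [1, 2, 3, 4, 5] (so s^2 = 2.5, n_0 = 5), half-width d = 0.5, and t_{4, 0.025} ≈ 2.776, the rule calls for 78 observations in total; with a loose requirement such as d = 10 it stops at the pilot sample of 5. Because N is random (it depends on s^2), the paper's results concern the moments of this sample size.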