Necessary and sufficient conditions are given for the existence of a uniformly consistent estimate of an unknown parameter $\theta$ when the successive observations are not necessarily independent and the number of unknown parameters involved in the joint distribution of the observations increases indefinitely with the number of observations. In analogy with R. A. Fisher's information function, the amount of information contained in the first $n$ observations regarding $\theta$ is defined. A sufficient condition for the non-existence of a uniformly consistent estimate of $\theta$ is given in Section 3 in terms of the information function. Section 4 gives a simplified expression for the amount of information when the successive observations are independent.
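To indicate the flavor of the simplification referred to for the independent case (a sketch using the classical Fisher information; the information function defined in the paper is more general, and the densities $f_i$ here are illustrative):
\[
  I_n(\theta) \;=\; \sum_{i=1}^{n} \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\,\log f_i(X_i;\theta)\right)^{\!2}\right],
\]
that is, when the observations $X_1,\dots,X_n$ are independent with densities $f_i(x;\theta)$, the joint log-likelihood splits into a sum and the total information becomes the sum of the per-observation informations.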