Grassberger suggested an interesting entropy estimator, namely, $\frac{n \log n}{\sum^n_{i=1} L^n_i},$ where $L^n_i$ is the length of the shortest prefix of $x_i, x_{i+1},\ldots$ that is not a prefix of any other $x_j, x_{j+1},\ldots$, for $j \leq n$, $j \neq i$. We show that this estimator is not consistent for the general ergodic process, although it is consistent for Markov chains. A weaker trimmed-mean-type result is proved for the general case, namely, given $\varepsilon > 0$, eventually almost surely all but an $\varepsilon$ fraction of the values $L^n_i/\log n$ lie within $\varepsilon$ of $1/H$. A related Hausdorff dimension conjecture is shown to be false.
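As an illustrative sketch only (not part of the paper), the estimator can be computed directly on a finite sample. The naive $O(n^2)$ prefix search below, the truncation convention for suffixes whose entire tail is shared, and the choice of natural logarithms (so the estimate is in nats) are all assumptions for this sketch; the theory concerns infinite sequences.

```python
import math
import random

def prefix_lengths(x):
    """L_i^n: length of the shortest prefix of x[i:] that is not a
    prefix of any other suffix x[j:], j < n, j != i.  If the whole
    tail x[i:] is still shared, fall back to n - i + 1 (a
    finite-sample cutoff; the definition assumes infinite data)."""
    n = len(x)
    lengths = []
    for i in range(n):
        L = 1
        # grow L while some other suffix still shares the length-L prefix
        while L <= n - i and any(x[j:j + L] == x[i:i + L]
                                 for j in range(n) if j != i):
            L += 1
        lengths.append(L)
    return lengths

def grassberger_estimate(x):
    """The estimator n log n / sum_i L_i^n, in nats."""
    n = len(x)
    return n * math.log(n) / sum(prefix_lengths(x))

# i.i.d. fair coin flips: the entropy rate is log 2 ~ 0.693 nats,
# so the estimate should come out in that vicinity for moderate n.
random.seed(0)
x = [random.randint(0, 1) for _ in range(400)]
print(round(grassberger_estimate(x), 3))
```

For an i.i.d. or Markov source the estimate approaches the true entropy rate as $n$ grows, consistent with the Markov-chain result above; the paper's point is that no such guarantee holds for general ergodic processes.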