Consider a stationary $m$-dependent sequence of random indicator variables. If $m > 1$, assume further that any two nonzero values are separated by at least $m - 1$ zeros. This paper studies the sequence of lengths of the successive intervals between the nonzero values and shows that, provided a technical condition holds, these lengths converge in distribution, with their moments converging exponentially fast, in all cases but one.
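
As a minimal formalization of this setup (the symbols $X_n$, $\tau_k$, and $T_k$ are introduced here for illustration and are not fixed by the abstract): let $(X_n)_{n \ge 1}$ be the stationary $m$-dependent indicator sequence, and define the positions of its nonzero values and the interval lengths between them by
\[
\tau_1 = \min\{n \ge 1 : X_n = 1\}, \qquad \tau_{k+1} = \min\{n > \tau_k : X_n = 1\}, \quad k \ge 1,
\]
\[
T_k = \tau_{k+1} - \tau_k, \qquad k \ge 1.
\]
Under the separation assumption for $m > 1$, every interval length satisfies $T_k \ge m$; the result described above concerns the convergence in distribution of the $T_k$, and of their moments, as $k \to \infty$.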