We consider noninteracting systems of infinitely many particles, each of which follows an irreducible, null recurrent Markov process, and prove a large deviation principle for the empirical density. The expected occupation time (up to time $N$) of this Markov process, denoted by $h(N)$, plays an essential role in our result. We require $h(N)$ to be regularly varying as $N \rightarrow \infty$, a condition satisfied by a large class of transition probabilities. Some features of our result are: (a) The large deviation tails decay like $\exp\lbrack - N h(N)^{-1} I(\cdot)\rbrack$, more slowly than the $\exp\lbrack - NI(\cdot) \rbrack$ type of decay known in transient situations. (b) Our rate function $I(\lambda(\cdot))$ is infinite unless $\lambda(\cdot)$ is an invariant distribution. (c) Our rate function is explicit and rather insensitive to the underlying Markov process. For instance, if the time steps of a Markov chain are randomized by exponential waiting times of mean 1, the resulting system obeys exactly the same large deviation principle.
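For orientation, the principle claimed in (a) can be written schematically as below. This is a sketch only: the symbol $\mu_N$ for the empirical density and the event set $A$ are placeholder notation introduced here (not taken from the abstract), and topologies, constants, and the precise sense of the asymptotics are suppressed; $h$ and $I$ are as above.

% Schematic form of the large deviation principle (illustrative sketch only).
% \mu_N and A are placeholder symbols; h(N) and I are as in the abstract.
\[
  P\bigl( \mu_N \in A \bigr)
    \approx \exp\!\Bigl[ - \frac{N}{h(N)} \, \inf_{\lambda \in A} I(\lambda) \Bigr],
  \qquad N \rightarrow \infty .
\]
% Since h(N) -> \infty under null recurrence while h(N)/N -> 0, the speed
% N/h(N) tends to infinity but is o(N); this is why the tails decay more
% slowly than the exp[-N I(.)] decay of the transient case. By feature (b),
% the infimum is effectively taken over invariant distributions lying in A.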