We introduce backoff processes, an idealized stochastic model of
browsing on the World Wide Web, which incorporates both hyperlink traversals
and use of the “back button.” With some probability the next
state is generated by a distribution over out-edges from the current state, as
in a traditional Markov chain. With the remaining probability, however, the next
state is generated by clicking on the back button and returning to the state
from which the current state was entered by a "forward step."
Repeated clicks on the back button require access to increasingly distant
history. We show that this process has fascinating similarities to and
differences from Markov chains. In particular, we prove that, like Markov
chains, backoff processes always have a limit distribution, and we give
algorithms to compute this distribution. Unlike Markov chains, the limit
distribution may depend on the start state.
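The following sketch is one minimal way to simulate such a process and estimate its limit distribution empirically; it is not the paper's construction. It assumes a single uniform back-button probability back_prob (the paper's model may allow state-dependent probabilities) and assumes that a forward step is taken whenever the history is empty.

import random
from collections import Counter

def simulate_backoff(out_edges, start, back_prob, steps, seed=0):
    # out_edges[s] maps each successor of s to its forward-step probability.
    rng = random.Random(seed)
    state, history = start, []          # history of states entered by forward steps
    visits = Counter()
    for _ in range(steps):
        visits[state] += 1
        if history and rng.random() < back_prob:
            state = history.pop()       # back step: return to the state we came from
        else:
            nexts, probs = zip(*out_edges[state].items())
            history.append(state)       # forward step: remember where we were
            state = rng.choices(nexts, weights=probs)[0]
    # Empirical visit frequencies approximate the limit distribution.
    return {s: c / steps for s, c in visits.items()}

# Toy three-state example (hypothetical transition probabilities).
edges = {0: {1: 1.0}, 1: {0: 0.5, 2: 0.5}, 2: {0: 1.0}}
print(simulate_backoff(edges, start=0, back_prob=0.3, steps=100_000))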