We anticipate that future web search techniques will exploit changes in web structure and content. As a first step in this direction, we examine the problem of integrating observed changes in link structure into static hyperlink-based ranking computations.
¶ We present a very efficient algorithm to incrementally compute good approximations to Google's PageRank [Brin and Page 98] as links evolve. Our experiments reveal that this algorithm is fast and yields excellent approximations to PageRank, even in light of large changes to the link structure.
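The abstract does not specify the incremental algorithm itself, so the sketch below is only a hypothetical baseline for the same task: after a link change, re-run PageRank's power iteration warm-started from the previous PageRank vector, which typically needs few iterations when the change is small. The function names, damping factor, and toy graph are assumptions for illustration, not the paper's method.

```python
# Hypothetical baseline, not the paper's algorithm: warm-started power
# iteration. Small link changes perturb PageRank only slightly, so starting
# from the old vector converges much faster than starting from scratch.
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10, start=None):
    """Power iteration for PageRank; adj[i, j] = 1 iff page i links to page j."""
    n = adj.shape[0]
    out_deg = adj.sum(axis=1)
    # Row-stochastic random-walk matrix; dangling pages jump uniformly.
    walk = np.where(out_deg[:, None] > 0,
                    adj / np.maximum(out_deg, 1.0)[:, None],
                    1.0 / n)
    pi = np.full(n, 1.0 / n) if start is None else start.copy()
    while True:
        nxt = damping * (walk.T @ pi) + (1.0 - damping) / n
        if np.abs(nxt - pi).sum() < tol:
            return nxt
        pi = nxt

adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 1, 0]], dtype=float)
pi_old = pagerank(adj)                  # PageRank of the original graph
adj[1, 2] = 1.0                         # a new link 1 -> 2 is observed
pi_new = pagerank(adj, start=pi_old)    # warm start from the old vector
```

The warm start exploits the fact that the old PageRank vector is already close to the new fixed point; the incremental algorithm the abstract refers to is presumably far cheaper still than this full recomputation.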
¶ Our algorithm derives intuition and partial justification from a rigorous sensitivity analysis of Markov chains. Consider a regular Markov chain with stationary distribution $\pi$, and suppose the transition probability into a state $j$ is increased. We prove that this can only cause:
(1) $\pi_j$ to increase; that is, adding a link to a site can only cause the stationary probability of the target site to increase;
(2) the rank of $j$ to improve; that is, if the states are ordered according to their stationary probabilities, then adding a link to a site can only cause the rank of the target site to improve.
A small numerical illustration of claim (1) follows.
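The sketch below is a quick sanity check of claim (1), not the paper's proof: it builds an arbitrary 3-state regular chain, increases the transition probability into state $j$ from one source state while renormalizing that row, and confirms that $\pi_j$ increases. The particular chain and all names are illustrative assumptions.

```python
# Numerical check of the monotonicity claim: pushing more transition
# probability into state j (here from state 0, renormalizing the row)
# can only increase the stationary probability pi_j of a regular chain.
import numpy as np

def stationary(P):
    """Stationary distribution of a regular row-stochastic matrix P."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])  # eigenvector for eigenvalue 1
    return v / v.sum()

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.4, 0.4, 0.2]])
j = 2
pi = stationary(P)

Q = P.copy()
Q[0, j] += 0.2        # strengthen the transition 0 -> j ...
Q[0] /= Q[0].sum()    # ... and renormalize row 0 to stay stochastic
pi_new = stationary(Q)

print(round(pi[j], 3), "->", round(pi_new[j], 3))  # 0.2 -> ~0.242
assert pi_new[j] > pi[j]  # pi_j increased, as the theorem predicts
```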
¶ This analysis formalizes the intuition underlying Google's ranking: adding a hyperlink can never hurt the PageRank, or the relative rank, of the page it points to.