Suppose $X_1, X_2, \dots, X_{\nu - 1}$ are iid random variables with distribution $F_0$, and $X_{\nu}, X_{\nu + 1}, \dots$ are iid with distribution $F_1$. The change point $\nu$ is unknown. The problem is to raise an alarm as soon as possible after the distribution changes from $F_0$ to $F_1$ (i.e., to detect the change), while avoiding false alarms.
Pollak found a version of the Shiryayev-Roberts procedure to be asymptotically optimal for the problem of minimizing the average run length to detection over all stopping times that satisfy a given constraint on the rate of false alarms. Here we find that this procedure is strictly optimal for a slight reformulation of the problem he considered.
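For orientation, the Shiryayev-Roberts procedure is commonly written as follows; this is the standard textbook formulation (assuming $F_0$ and $F_1$ admit densities $f_0$ and $f_1$), and the symbols $R_n$, $A$, and $N_A$ are illustrative rather than necessarily the paper's own notation:
\[
R_n = \sum_{k=1}^{n} \prod_{i=k}^{n} \frac{f_1(X_i)}{f_0(X_i)}, \qquad N_A = \inf\{\, n \ge 1 : R_n \ge A \,\},
\]
equivalently computed through the recursion $R_n = (1 + R_{n-1})\, f_1(X_n)/f_0(X_n)$ with $R_0 = 0$, where the threshold $A$ is chosen to meet the false-alarm constraint.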
Explicit formulas are developed for the calculation of the average run length (both before and after the change) for the optimal stopping time.