Using the basic inequalities (1), it is shown that if, in a sequential probability ratio test, the upper stopping bound is increased and the lower stopping bound is decreased, and if the new test is not equivalent to the old one, then at least one of the error probabilities is decreased. This implies the monotonicity result of Weiss [5] in the continuous case and the uniqueness result of Anderson and Friedman [1] in the general case. The relation of the monotonicity property to the optimum property and to the uniqueness of sequential probability ratio tests is discussed. The monotonicity property is a consequence of the following stronger result. Let the old and new tests be given by the stopping bounds $(B', A')$ and $(B, A)$, respectively, with $B < B' < A' < A$; let $(\alpha'_1, \alpha'_2)$ and $(\alpha_1, \alpha_2)$ be the corresponding error probabilities and $\Delta\alpha_i = \alpha_i - \alpha'_i$ the changes in the error probabilities; then the vector $(\Delta\alpha_1, \Delta\alpha_2)$ is restricted to a cone consisting of the 3rd quadrant, plus the part of the 2nd quadrant where $-\Delta\alpha_2/\Delta\alpha_1 < B$, plus the part of the 4th quadrant where $-\Delta\alpha_2/\Delta\alpha_1 > A$. Another consequence of this result is that $(\alpha_1, \alpha_2)$ cannot lie in the closed triangle with vertices $(\alpha'_1, \alpha'_2)$, $(0, 1)$, and $(1, 0)$. Finally, the following monotonicity property holds: if the lower stopping bound is held fixed and the upper stopping bound is increased, then $\alpha_1/(1 - \alpha_2)$ decreases monotonically; the same holds for $\alpha_2/(1 - \alpha_1)$ if the upper stopping bound is held fixed and the lower stopping bound is decreased.
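As an illustrative aside (not part of the paper), both monotonicity statements can be checked numerically. The sketch below is a minimal Monte Carlo simulation assuming a Gaussian SPRT of $N(1, 1)$ against $N(0, 1)$ under Wald's conventions: sampling continues while the likelihood ratio stays in $(B, A)$, with $\alpha_1 = P_0(\text{accept } H_1)$ and $\alpha_2 = P_1(\text{accept } H_0)$. The model, the particular bounds, and the function names `sprt_decision` and `error_probs` are illustrative assumptions, not the paper's.

```python
import math
import random

def sprt_decision(mu_true, B, A, rng, mu0=0.0, mu1=1.0, max_n=10_000):
    """Run one SPRT path on N(mu_true, 1) data; return True iff H1 is accepted.

    For N(mu1, 1) vs N(mu0, 1) the log likelihood ratio increments by
    (mu1 - mu0) * x - (mu1**2 - mu0**2) / 2 per observation x.
    """
    log_B, log_A = math.log(B), math.log(A)
    llr = 0.0
    for _ in range(max_n):
        x = rng.gauss(mu_true, 1.0)
        llr += (mu1 - mu0) * x - (mu1**2 - mu0**2) / 2.0
        if llr >= log_A:
            return True   # crossed the upper stopping bound: accept H1
        if llr <= log_B:
            return False  # crossed the lower stopping bound: accept H0
    return llr > 0.0      # truncation fallback; essentially never reached here

def error_probs(B, A, n_paths=20_000, seed=0):
    """Monte Carlo estimates of (alpha1, alpha2) = (P0(accept H1), P1(accept H0))."""
    rng = random.Random(seed)
    a1 = sum(sprt_decision(0.0, B, A, rng) for _ in range(n_paths)) / n_paths
    a2 = sum(not sprt_decision(1.0, B, A, rng) for _ in range(n_paths)) / n_paths
    return a1, a2

if __name__ == "__main__":
    # Widen both bounds: (B', A') = (0.10, 10) strictly inside (B, A) = (0.05, 20).
    # The paper's result guarantees at least one error probability decreases.
    for label, B, A in [("old (B'=0.10, A'=10)", 0.10, 10.0),
                        ("new (B =0.05, A =20)", 0.05, 20.0)]:
        a1, a2 = error_probs(B, A)
        print(f"{label}: alpha1={a1:.4f}, alpha2={a2:.4f}")

    # Hold B fixed and increase A: alpha1/(1 - alpha2) should decrease
    # monotonically (Wald's inequality bounds it by 1/A).
    for A in (5.0, 10.0, 20.0):
        a1, a2 = error_probs(B=0.10, A=A)
        print(f"B=0.10, A={A:>4}: alpha1/(1-alpha2) = {a1 / (1 - a2):.4f}")
```

With 20,000 paths per estimate the Monte Carlo standard error is roughly 0.002, small enough to separate the three values of $\alpha_1/(1 - \alpha_2)$ above; increase `n_paths` for sharper comparisons. Note the simulation only illustrates the continuous (Gaussian) case, where both error probabilities typically decrease; the cone restriction on $(\Delta\alpha_1, \Delta\alpha_2)$ is the paper's general statement.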