A "law of large numbers" for the maximum of i.i.d. univariate normal random variables is extended to a general multivariate setting. Let $\mathbf{Z}_i$ denote i.i.d. Banach space valued random variables with a centered Gaussian distribution, and let $\mathbf{K}$ denote the unit ball of the associated reproducing kernel Hilbert space. Then, with probability 1, the maximum distance from the sample points $\mathbf{Z}_1, \mathbf{Z}_2, \ldots, \mathbf{Z}_n$ to the set $\sqrt{2 \log n}\,\mathbf{K}$ approaches zero, and the sample points form $\varepsilon$-nets for this set as $n \to \infty$.
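In the univariate case the statement reduces to the classical fact that the maximum of $n$ i.i.d. standard normals, divided by $\sqrt{2 \log n}$, converges to 1 almost surely. A minimal numerical sketch of this special case (assuming NumPy; the function name `max_over_threshold_ratio` is illustrative, not from the paper):

```python
import numpy as np

def max_over_threshold_ratio(n, seed=0):
    """Ratio of the maximum of n i.i.d. standard normals to sqrt(2 log n).

    By the classical law of large numbers for maxima, this ratio
    tends to 1 almost surely as n grows (convergence is slow, so
    moderate n gives ratios somewhat below 1).
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    return z.max() / np.sqrt(2.0 * np.log(n))

if __name__ == "__main__":
    for n in (10**3, 10**5, 10**7):
        print(n, max_over_threshold_ratio(n))
```

The convergence is slow because the expected maximum sits below $\sqrt{2 \log n}$ by a correction of order $\log \log n / \sqrt{\log n}$, so the printed ratios creep toward 1 only gradually as $n$ increases.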