Let $(X_i)_{i \in \mathbb{Z}^d_+}$ be an array of independent, identically distributed zero-mean random vectors with values in $\mathbb{R}^k$. When $E(|X_1|^r) < +\infty$ for some $r > 2$, we obtain a strong approximation of the partial-sum process $(\sum_{i \in \nu S} X_i : S \in \mathscr{J})$ by a Gaussian partial-sum process $(\sum_{i \in \nu S} Y_i : S \in \mathscr{J})$, uniformly over all sets in a certain Vapnik--Chervonenkis class $\mathscr{J}$ of subsets of $\lbrack 0, 1\rbrack^d$. The most striking result is that an array $(X_i)_{i \in \mathbb{Z}^d_+}$ of i.i.d. random vectors and an array $(Y_i)_{i \in \mathbb{Z}^d_+}$ of independent $N(0, \operatorname{Var} X_1)$-distributed random vectors may be constructed in such a way that, up to a power of $\log \nu$, $\sup_{S \in \mathscr{J}} |\sum_{i \in \nu S} (X_i - Y_i)| = O(\nu^{(d - 1)/2} \vee \nu^{d/r})$ a.s., for any Vapnik--Chervonenkis class $\mathscr{J}$ fulfilling the uniform Minkowski condition. It follows readily from a 1985 paper of Beck that this rate cannot be improved when $\mathscr{J}$ is the class of Euclidean balls.