In this paper a test based on the union-intersection principle is proposed for overall independence between $p$ variates distributed according to the multivariate normal law, and this is extended to the hypothesis of independence between several groups of variates which have a joint multivariate normal distribution. Methods used in earlier papers [3, 4] have been applied in order to invert these tests for each situation and to obtain, with a joint confidence coefficient greater than or equal to a preassigned level, simultaneous confidence bounds on certain parametric functions. These parametric functions are, in case I, the moduli of the regression vectors: (a) of the variate $p$ on the variates $(p - 1), (p - 2), \cdots, 2, 1,$ or on any subset of the latter; (b) of the variate $(p - 1)$ on the variates $(p - 2), (p - 3), \cdots, 2, 1,$ or on any subset of the latter; and so on, down to (c) the variate 2 on the variate 1. For case II, parallel to each case considered above, there is an analogous statement in which the regression vector is replaced by a regression matrix, $\beta$, say, and the "modulus" of the regression vector is replaced by the (positive) square root of the largest characteristic root of $\beta\beta'$. Simultaneous confidence bounds on these sets of parameters are given. The proposed tests of the hypotheses of multiple independence are offered as an alternative to a long-known class of tests based on the likelihood-ratio criterion [5, 6]. As for the confidence bounds, however, it is believed that no other easily obtainable bounds are available in this area. One of the objects of these confidence bounds is the detection of the "culprit variates" when the hypothesis of multiple independence is rejected, for the "complex" hypothesis is, in this case, the intersection of several more "elementary" hypotheses of two-by-two independence.
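For concreteness, the following sketch records the parametric functions in question, in a notation assumed here rather than fixed by the abstract. If the variates are partitioned into two blocks with joint covariance matrix
\[
\Sigma = \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix},
\]
then, under the multivariate normal model, the regression of the first block on the second has coefficient matrix $\beta = \Sigma_{12}\Sigma_{22}^{-1}$. When the first block consists of a single variate (case I), $\beta$ is a row vector and the parametric function is its modulus $(\beta\beta')^{1/2}$; when the first block contains several variates (case II), the parametric function is the positive square root of the largest characteristic root of $\beta\beta'$, which reduces to the modulus when $\beta$ has a single row.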