The following theorem is proved. Let $X_1, X_2, \cdots, X_n$ be $n$ independent (but not necessarily identically distributed) random variables, and assume that the $n$th moment of each $X_i$ $(i = 1, 2, \cdots, n)$ exists. The necessary and sufficient conditions for the two linear forms $Y_1 = \sum^n_{s=1} a_sX_s$ and $Y_2 = \sum^n_{s=1} b_sX_s$ to be statistically independent are: (A) each random variable which has a nonzero coefficient in both forms is normally distributed; (B) $\sum^n_{s=1} a_sb_s\sigma^2_s = 0$, where $\sigma^2_s$ denotes the variance of $X_s$ $(s = 1, 2, \cdots, n)$. For $n = 2$ and $a_1 = b_1 = a_2 = 1$, $b_2 = -1$ this reduces to a theorem of S. Bernstein [1]. Bernstein's paper was not accessible to the authors, whose knowledge of his result was derived from a statement of his theorem contained in a paper by M. Fréchet [3]. A more general result, not assuming the existence of moments, was obtained earlier by M. Kac [4]. A related theorem, assuming equidistribution of the $X_i$ $(i = 1, 2, \cdots, n)$, is stated without proof in a recent paper by Yu. V. Linnik [5].
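To make the reduction to Bernstein's theorem explicit (a worked specialization, not part of the original statement): for $n = 2$ with $a_1 = b_1 = a_2 = 1$ and $b_2 = -1$, the two forms become $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$. Both $X_1$ and $X_2$ have nonzero coefficients in both forms, so condition (A) requires each to be normally distributed, and condition (B) becomes
$$a_1 b_1 \sigma^2_1 + a_2 b_2 \sigma^2_2 = \sigma^2_1 - \sigma^2_2 = 0,$$
that is, $\sigma^2_1 = \sigma^2_2$. Thus the independence of the sum and the difference of two independent random variables forces both to be normal with equal variances, which is the content of Bernstein's theorem.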