M. C. K. Tweedie [2] defined the inverse Gaussian distributions via the density functions \begin{equation*}\tag{1}f(x; m, \lambda) = \begin{cases}\lbrack\lambda/(2\pi x^3)\rbrack^{\frac{1}{2}} \exp\lbrack -\lambda(x - m)^2/(2m^2x)\rbrack & \text{for } x > 0,\\ 0 & \text{for } x \leqq 0,\end{cases}\end{equation*} where the parameters $\lambda$ and $m$ are positive. The corresponding densities reflected about the origin, with $\lambda$ and $m$ negative, may also be regarded as members of the inverse Gaussian family. The characteristic function of the inverse Gaussian distribution with parameters $\lambda, m$ is \begin{equation*}\tag{2}\phi(t) = \exp \lbrack\lambda\{1 - (1 - 2im^2t\lambda^{-1})^{\frac{1}{2}}\}/m\rbrack,\quad i = \sqrt{-1},\end{equation*} for all real values of $t$. If $x_1, x_2, \cdots, x_n$ are $n$ independent observations from (1), then $y = \sum^n_{j=1} x_j$ and $z = \sum^n_{j=1} x^{-1}_j - n^2y^{-1}$ are independently distributed. The distribution of $y$ is $f(y; nm, n^2\lambda)$ and that of $\lambda z$ is chi-square with $(n - 1)$ degrees of freedom. In this note we prove the converse: if $x_1, x_2, \cdots, x_n$ are independently and identically distributed variates, certain of whose moments exist and are nonzero, and if $y$ and $z$ are independently distributed, then the distribution of each $x_j$ is inverse Gaussian.
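The first of these sampling facts follows at once from (2); as a brief verification (not part of Tweedie's original statement), the independence of the $x_j$ gives for the characteristic function of $y$ \begin{align*}\phi_y(t) = \lbrack\phi(t)\rbrack^n &= \exp\lbrack n\lambda\{1 - (1 - 2im^2t\lambda^{-1})^{\frac{1}{2}}\}/m\rbrack\\ &= \exp\lbrack(n^2\lambda)\{1 - (1 - 2i(nm)^2t(n^2\lambda)^{-1})^{\frac{1}{2}}\}/(nm)\rbrack,\end{align*} since $m^2\lambda^{-1} = (nm)^2(n^2\lambda)^{-1}$ and $n\lambda/m = n^2\lambda/(nm)$. This is (2) with $m$ and $\lambda$ replaced by $nm$ and $n^2\lambda$, so $y$ has density $f(y; nm, n^2\lambda)$.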