A Decomposition Theorem for Vector Variables with a Linear Structure
Rao, C. Radhakrishna
Ann. Math. Statist., Volume 40 (1969) no. 6, pp. 1845-1849 / Harvested from Project Euclid
A vector variable $\mathbf{X}$ is said to have a linear structure if it can be written as $\mathbf{X} = \mathbf{AY}$, where $\mathbf{A}$ is a matrix and $\mathbf{Y}$ is a vector of independent random variables called structural variables. In earlier papers, the conditions under which a vector random variable admits different structural representations were studied. It was shown, among other results, that complete non-uniqueness, in a suitable sense, of the linear structure characterizes a multivariate normal variable. In the present paper we prove a general decomposition theorem which states that any vector variable $\mathbf{X}$ with a linear structure can be expressed as the sum $(\mathbf{X}_1 + \mathbf{X}_2)$ of two independent vector variables $\mathbf{X}_1, \mathbf{X}_2$, of which $\mathbf{X}_1$ is non-normal and has a unique linear structure, and $\mathbf{X}_2$ is a multivariate normal variable with a non-unique linear structure.
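As an illustrative sketch (an assumed example, not part of the original abstract): suppose the structural vector is $\mathbf{Y} = (Y_1, Y_2, Y_3)'$, where $Y_1$ is, say, exponentially distributed, $Y_2, Y_3$ are standard normal, all three are independent, and the columns of $\mathbf{A}$ are partitioned conformably as $\mathbf{A} = (\mathbf{a}_1, \mathbf{A}_2)$. Then
$$\mathbf{X} = \mathbf{AY} = \mathbf{a}_1 Y_1 + \mathbf{A}_2 (Y_2, Y_3)' = \mathbf{X}_1 + \mathbf{X}_2,$$
with $\mathbf{X}_1 = \mathbf{a}_1 Y_1$ non-normal, $\mathbf{X}_2 = \mathbf{A}_2 (Y_2, Y_3)'$ multivariate normal, and the two summands independent because the structural variables are independent. The content of the theorem, as stated in the abstract, is that such a splitting exists for every $\mathbf{X}$ with a linear structure, with $\mathbf{X}_1$ admitting a unique linear structure, even when the normal components are not exhibited in advance as in this example.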
Published: 1969-10-14
@article{1177697400,
     author = {Rao, C. Radhakrishna},
     title = {A Decomposition Theorem for Vector Variables with a Linear Structure},
     journal = {Ann. Math. Statist.},
     volume = {40},
     number = {6},
     year = {1969},
     pages = {1845--1849},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177697400}
}
Rao, C. Radhakrishna. A Decomposition Theorem for Vector Variables with a Linear Structure. Ann. Math. Statist., Volume 40 (1969) no. 6, pp. 1845-1849. http://gdmltest.u-ga.fr/item/1177697400/