Factor analysis and its extensions are widely used in the social and
behavioral sciences, and can be considered useful tools for exploration and
model fitting in multivariate analysis. Despite its popularity in applications,
factor analysis has attracted rather limited attention from statisticians.
Three issues may have contributed to statisticians' lack of interest in factor
analysis: identification ambiguity, heavy reliance on normality, and the
limitation to linearity. In this paper, statistical contributions to the first
two issues are reviewed, and the third is addressed in detail.
Linear models can be unrealistic even as an approximation in many applications,
and often do not fit the data well without increasing the number of factors
beyond the level explainable by the subject-matter theory. As an exploratory
model, the conventional factor analysis model fails to address nonlinear
structure underlying multivariate data. It is argued here that factor analysis
does not need to be restricted to linearity and that nonlinear factor analysis
can be formulated and carried out as a useful statistical method. In
particular, for a general parametric nonlinear factor analysis model, the
errors-in-variables parameterization is suggested as a sensible way to
formulate the model, and two procedures for model fitting are introduced and
described. Tests for the goodness of fit of the model are also proposed. The
procedures are evaluated in a simulation study. An example from personality
testing is used to illustrate the issues and the methods.
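As an illustrative sketch only (the notation below is an assumption for exposition, not reproduced from the paper), a general parametric nonlinear factor analysis model for a $p$-vector of observations $Z_i$ with a $k$-vector of latent factors $f_i$ can be written as
\[
  Z_i \;=\; g(f_i; \theta) + \varepsilon_i, \qquad i = 1, \dots, n,
\]
where $g$ is a known parametric function and $\varepsilon_i$ is a vector of measurement errors. In an errors-in-variables style parameterization, $k$ of the observed variables are taken to measure the factors directly, which fixes the scale and origin of the factors and removes part of the identification ambiguity:
\[
  Z_i \;=\;
  \begin{pmatrix} X_i \\ Y_i \end{pmatrix}
  \;=\;
  \begin{pmatrix} f_i \\ h(f_i;\, \theta) \end{pmatrix}
  + \varepsilon_i,
\]
so that only the nonlinear part $h$ and the error structure remain to be estimated. The hypothetical partition of $Z_i$ into $(X_i, Y_i)$ is one common convention; the paper's own formulation may differ in detail.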