In a general variance component model, nonnegative quadratic estimators of the components of variance are considered that are invariant with respect to mean value translations and have minimum bias, in analogy to the estimation theory for mean value parameters. Here the minimum is taken over an appropriate cone of positive semidefinite matrices, after a reduction by invariance. Among these estimators, which always exist, the one of minimum norm is characterized. This characterization is achieved through systems of necessary and sufficient conditions and by a nonlinear cone-restricted pseudoinverse. A representation of this pseudoinverse is given that allows computation without consideration of the boundary. In models where the decomposing covariance matrices span a commutative quadratic subspace, a representation of the estimator in question is derived that requires only the solution of an ordinary convex quadratic optimization problem. As an example, we present the two-way nested classification random model. When unbiased nonnegative quadratic estimation is possible, this estimator automatically becomes the "nonnegative MINQUE". In addition, a general representation of the MINQUE is given that involves just one matrix pseudoinversion in the reduced model.
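For orientation, a minimal sketch of the standard framework the abstract alludes to; the symbols $y$, $X$, $\beta$, $V_i$, $\sigma_i^2$, and $A$ below are illustrative assumptions, not notation fixed by the paper itself:
\[
y = X\beta + \varepsilon, \qquad \operatorname{Cov}(y) = \sum_{i=1}^{k} \sigma_i^{2} V_i, \qquad \sigma_i^{2} \ge 0 .
\]
A quadratic estimator $y^{\top} A y$ of a linear combination $\sum_{i} p_i \sigma_i^{2}$ is invariant with respect to mean value translations if $AX = 0$ and nonnegative if $A \succeq 0$; its bias is governed by the quantities $\operatorname{tr}(AV_i) - p_i$. Roughly speaking, the estimators considered here minimize such a bias criterion over the cone $\{A \succeq 0 : AX = 0\}$ (after the reduction by invariance) and, among the minimizers, have minimal norm $\lVert A \rVert$, in analogy with Rao's MINQUE, which imposes exact unbiasedness $\operatorname{tr}(AV_i) = p_i$ instead.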