We consider the problems of (i) covariance adjustment of an unbiased estimator, (ii) combining two unbiased estimators, and (iii) improving upon an unbiased estimator. Each of these problems consists in determining a minimum dispersion linear unbiased combination of two given statistics, one of which is an unbiased estimator of a vector parameter $\boldsymbol{\theta} \in \mathscr{H}$, while the expectation of the other is the zero vector in the problem of covariance adjustment, is equal to $\boldsymbol{\theta}$ in the problem of combining, and is equal to a subvector of $\boldsymbol{\theta}$ in the problem of improving. The solutions obtained substantially generalize known results, in that they are valid for an arbitrary joint dispersion matrix of the two statistics as well as for a parameter space $\mathscr{H}$ that is an arbitrary subspace of $\mathscr{R}^k$.
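As orientation for the covariance-adjustment problem, the following sketch records only the familiar special case in which $\mathscr{H} = \mathscr{R}^k$; the partitioned dispersion matrix below, with $\mathbf{D}_{22}$ possibly singular, is introduced here purely for illustration and is not part of the general statement above. Suppose $\mathrm{E}(\mathbf{t}_1) = \boldsymbol{\theta}$, $\mathrm{E}(\mathbf{t}_2) = \mathbf{0}$, and
\[
\mathrm{D}\begin{pmatrix} \mathbf{t}_1 \\ \mathbf{t}_2 \end{pmatrix}
= \begin{pmatrix} \mathbf{D}_{11} & \mathbf{D}_{12} \\ \mathbf{D}_{21} & \mathbf{D}_{22} \end{pmatrix}.
\]
Among the combinations $\mathbf{t}_1 + \mathbf{A}\mathbf{t}_2$, each of which is unbiased for every $\mathbf{A}$ because $\mathrm{E}(\mathbf{t}_2) = \mathbf{0}$, the dispersion
\[
\mathrm{D}(\mathbf{t}_1 + \mathbf{A}\mathbf{t}_2)
= \mathbf{D}_{11} + \mathbf{A}\mathbf{D}_{21} + \mathbf{D}_{12}\mathbf{A}' + \mathbf{A}\mathbf{D}_{22}\mathbf{A}'
\]
is minimized in the L\"{o}wner ordering at $\mathbf{A} = -\mathbf{D}_{12}\mathbf{D}_{22}^{-}$, yielding
\[
\tilde{\boldsymbol{\theta}} = \mathbf{t}_1 - \mathbf{D}_{12}\,\mathbf{D}_{22}^{-}\,\mathbf{t}_2,
\]
where $\mathbf{D}_{22}^{-}$ is any generalized inverse of $\mathbf{D}_{22}$; the equation $\mathbf{A}\mathbf{D}_{22} = -\mathbf{D}_{12}$ is solvable because the column space of $\mathbf{D}_{21}$ is contained in that of $\mathbf{D}_{22}$ for any dispersion matrix, so no nonsingularity assumption is needed.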