Consider a general linear model for a column vector $y$ of data with $E(y) = X \alpha$ and $\operatorname{Var}(y) = \sigma^2 H$, where $\alpha$ is a vector of unknown parameters and $X$ and $H$ are given matrices, possibly deficient in rank. Let $b = Ty$, where $T$ is any matrix of maximal rank such that $TH = \phi$ (the zero matrix). We investigate the estimation of a linear function of $\alpha$ by estimators of the form $c + a'y$, where $c$ and $a$ are permitted to depend on $b$. Allowing $c$ and $a$ to depend on $b$ enlarges the class of unbiased estimators in a nontrivial way; it does not, however, enlarge the class of linear functions of $\alpha$ that are estimable. Any minimum-variance unbiased estimator is, identically for $y$ in the column space of $(X, H)$, equal to the estimator having minimum variance among strictly linear unbiased estimators.
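The role of $b = Ty$ can be made concrete with a small numerical sketch. The dimensions, the random construction of $X$ and of a rank-deficient $H = LL'$, and the use of an eigendecomposition to obtain $T$ are all illustrative choices, not part of the paper's setup. Because $TH = \phi$, the statistic $b$ has zero variance: it equals $TX\alpha$ exactly, which is why conditioning $c$ and $a$ on $b$ creates no new randomness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: n = 5 observations, p = 2 parameters.
n, p = 5, 2
X = rng.standard_normal((n, p))
alpha = np.array([1.0, -2.0])

# Rank-deficient dispersion matrix: H = L L' has rank 3 < n.
L = rng.standard_normal((n, 3))
H = L @ L.T

# T: rows spanning the null space of H (H is symmetric, so TH = 0).
eigval, eigvec = np.linalg.eigh(H)
T = eigvec[:, eigval < 1e-10].T

# Verify the defining property TH = 0 (up to rounding).
assert np.allclose(T @ H, 0.0)

# Simulate y with E(y) = X alpha and Var(y) proportional to H.
z = rng.standard_normal(3)
y = X @ alpha + L @ z

# b = Ty carries no random error: TH = 0 forces TL = 0,
# so b equals T X alpha identically.
b = T @ y
assert np.allclose(b, T @ X @ alpha)
```

Here $T$ has $n - \operatorname{rank}(H)$ rows, so when $H$ is nonsingular $b$ is empty and the setting reduces to ordinary linear estimation.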