In a right-angled triangle, the hypotenuse is the longest side. Hence, if all (hypotenuse) vectors in a given set have the same orthogonal projection onto a fixed subspace, the length of that projection is a lower bound for their lengths. Interpreting the square of such a length as the variance of an unbiased estimator produces an information bound. The Cramér–Rao bound and the van Trees inequality can be seen as consequences of this bound. A further consequence is an inequality for the minimax variance, that is, the maximal variance over shrinking neighbourhoods, minimized over all unbiased estimators. This bound is non-asymptotic and requires almost no regularity conditions.
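The geometric idea can be sketched as follows; the symbols $H$, $U$, $\Pi$ below are illustrative notation chosen here, not necessarily the paper's own.

```latex
Let $H$ be a real inner-product space, $U \subseteq H$ a closed subspace,
and $\Pi$ the orthogonal projection onto $U$. Pythagoras gives, for every
$v \in H$,
\[
  \|v\|^2 \;=\; \|\Pi v\|^2 + \|v - \Pi v\|^2 \;\ge\; \|\Pi v\|^2 .
\]
Hence, if all vectors $v$ in a set $V \subseteq H$ share the same
projection $\Pi v = u$, then
\[
  \inf_{v \in V} \|v\|^2 \;\ge\; \|u\|^2 ,
\]
with equality if and only if $u \in V$. Reading $\|v\|^2$ as the variance
of an unbiased estimator, with $u$ its common projection onto a suitable
score space, yields an information bound of Cram\'er--Rao type.
```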