Extended gcd calculation has a long history and plays an important role
in computational number theory and linear algebra. Recent results have
shown that finding optimal multipliers in extended gcd calculations is
computationally difficult. We present an algorithm that uses lattice basis reduction to
produce small integer multipliers $x_1,\dots,x_m$ for the equation
$s=\gcd(s_1,\dots,s_m)=x_1s_1+\cdots+x_ms_m$, where $s_1,\dots,s_m$
are given integers. The method generalises to produce small unimodular
transformation matrices for computing the Hermite normal form of an
integer matrix.
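
To illustrate the idea behind the lattice approach, here is a sketch of one
standard construction (the scaling parameter $\gamma$ and this particular
basis are illustrative assumptions, not necessarily the exact formulation
used here): apply basis reduction to the lattice spanned by the rows of
\[
B \;=\;
\begin{pmatrix}
1 & 0 & \cdots & 0 & \gamma s_1\\
0 & 1 & \cdots & 0 & \gamma s_2\\
\vdots & \vdots & \ddots & \vdots & \vdots\\
0 & 0 & \cdots & 1 & \gamma s_m
\end{pmatrix}.
\]
For sufficiently large $\gamma$, a reduced basis contains a short vector of
the form $(x_1,\dots,x_m,\pm\gamma s)$, and its first $m$ entries are small
multipliers satisfying $x_1s_1+\cdots+x_ms_m=\pm s$; the remaining reduced
vectors have last coordinate $0$ and hence encode short relations
$x_1s_1+\cdots+x_ms_m=0$.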