This paper presents convergence conditions for a Markov chain
constructed using Gibbs sampling, when the equilibrium distribution is the
conditional sampling distribution of sufficient statistics from a generalized
linear model. For cases when this unidimensional sampling is done approximately
rather than exactly, the difference between the target equilibrium distribution
and the resulting equilibrium distribution is expressed in terms of the
difference between the true and approximating univariate conditional
distributions. These methods are applied to an algorithm facilitating
approximate conditional inference in canonical exponential families.
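As a rough illustration of the sampling scheme described above, the following sketch (not taken from the paper) runs a Gibbs-type chain in which each coordinate of the sufficient statistic vector is updated in turn by a user-supplied, possibly approximate, univariate conditional sampler. The names `gibbs_sweep`, `gibbs_chain`, and `approx_conditional_sampler` are hypothetical placeholders, assumed here only for concreteness.

```python
import numpy as np

def gibbs_sweep(t, approx_conditional_sampler, rng):
    """One full Gibbs sweep: update each coordinate of t by drawing from
    a (possibly approximate) version of its univariate conditional
    distribution given the remaining coordinates."""
    t = t.copy()
    for j in range(len(t)):
        # approx_conditional_sampler(j, t, rng) is assumed to return a draw
        # from an approximation to P(T_j | T_{-j} = t_{-j}); in the exact
        # case it draws from the true univariate conditional.
        t[j] = approx_conditional_sampler(j, t, rng)
    return t

def gibbs_chain(t0, approx_conditional_sampler, n_iter, rng=None):
    """Run n_iter full sweeps from the starting value t0 and return the
    trajectory of sampled sufficient-statistic vectors."""
    rng = np.random.default_rng() if rng is None else rng
    path = [np.asarray(t0, dtype=float)]
    for _ in range(n_iter):
        path.append(gibbs_sweep(path[-1], approx_conditional_sampler, rng))
    return np.array(path)
```

Under this reading, the paper's error analysis concerns how far the equilibrium distribution of such a chain can drift from the target when each `approx_conditional_sampler` draw only approximates the true univariate conditional.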