Convergence Rate Analysis of Markov Chains
We consider a number of Markov chains and derive bounds on the rate at which they converge to equilibrium. For our main problem, we establish results on the rate of convergence in total variation of a Gibbs sampler to its equilibrium distribution. This sampler is motivated by a hierarchical Bayesian inference construction for a gamma random variable. The hierarchical Bayesian method involves statistical models that combine prior beliefs with the likelihood of observed data to arrive at posterior inferences, and appears in applications in information technology, statistical genetics, market research, and elsewhere. Our results apply to a wide range of parameter values when the hierarchical depth is 3 or 4, and are more restrictive for depths greater than 4. Our method relates the total variation distance between two ordered copies of our chain to the maximum of the ratios of their respective coordinates. We construct auxiliary stochastic processes to show that this ratio converges to 1 at a geometric rate. We also consider a stochastic image restoration model proposed by A. Gibbs, and give an upper bound on the time it takes for the Markov chain defined by this model to be arbitrarily close in total variation to equilibrium. We use Gibbs' result on convergence in the Wasserstein metric to arrive at our result; our bound on the time to equilibrium is of the same order as that of Gibbs.
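The abstract does not specify the exact gamma hierarchy, so the following is a purely illustrative sketch of the kind of construction it describes: a 3-level conjugate gamma model, its systematic-scan Gibbs sampler, and a coupling of two ordered copies in which the maximum coordinate ratio can be tracked as it converges to 1. All names and parameter values (`a0`..`a3`, `b`, `y`) are assumptions for illustration, not taken from the thesis.

```python
import numpy as np

# Hypothetical 3-level gamma hierarchy (illustrative; the thesis's exact
# model may differ), with each rate parameter equal to the level above:
#   y  | X1 ~ Gamma(a0, rate=X1)
#   X1 | X2 ~ Gamma(a1, rate=X2)
#   X2 | X3 ~ Gamma(a2, rate=X3)
#   X3      ~ Gamma(a3, rate=b)
# Conjugacy makes every full conditional another gamma distribution, e.g.
#   X1 | y, X2 ~ Gamma(a0 + a1, rate = y + X2),
# so one systematic-scan Gibbs sweep is three closed-form gamma draws.

def gibbs_sweep(x, a, b, y, rng):
    """One Gibbs sweep over (X1, X2, X3); updates x in place."""
    a0, a1, a2, a3 = a
    x[0] = rng.gamma(a0 + a1) / (y + x[1])     # X1 | y, X2
    x[1] = rng.gamma(a1 + a2) / (x[0] + x[2])  # X2 | X1, X3
    x[2] = rng.gamma(a2 + a3) / (x[1] + b)     # X3 | X2
    return x

def coupled_sweep(x, xp, a, b, y, rng):
    """Sweep two copies of the chain with shared Gamma(shape, 1) draws.

    The conditional shape parameters depend only on (a0..a3), never on
    the state, so a single shared draw g can be rescaled by each copy's
    own rate.  The returned statistic is the maximum ratio between
    corresponding coordinates of the two copies, mirroring the ratio
    statistic used in the abstract's coupling argument.
    """
    a0, a1, a2, a3 = a
    g = rng.gamma(a0 + a1)
    x[0], xp[0] = g / (y + x[1]), g / (y + xp[1])
    g = rng.gamma(a1 + a2)
    x[1], xp[1] = g / (x[0] + x[2]), g / (xp[0] + xp[2])
    g = rng.gamma(a2 + a3)
    x[2], xp[2] = g / (x[1] + b), g / (xp[1] + b)
    return max(max(u / v, v / u) for u, v in zip(x, xp))

rng = np.random.default_rng(0)
a, b, y = (2.0, 3.0, 2.0, 2.0), 1.0, 2.0
x, xp = [0.1, 0.1, 0.1], [10.0, 10.0, 10.0]   # two copies started far apart
ratios = [coupled_sweep(x, xp, a, b, y, rng) for _ in range(300)]
# the maximum coordinate ratio falls toward 1 as the two copies couple
```

In this toy version the geometric contraction of the ratio is visible empirically: each update multiplies the discrepancy by a factor roughly `x / (y + x)` because the data `y` (or the hyperparameter `b`) enters every rate additively. This is only meant to make the abstract's coupling idea concrete; the thesis's actual argument uses auxiliary stochastic processes to prove the geometric rate.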