"Perfect sampling" enables exact draws from the invariant measure of a Markov chain. We show that the independent Metropolis-hastings chain has certain stochastic monotonicity properties that enable a perfec...
详细信息
"Perfect sampling" enables exact draws from the invariant measure of a Markov chain. We show that the independent Metropolis-hastings chain has certain stochastic monotonicity properties that enable a perfect sampling algorithm to be implemented, at least when the candidate is overdispersed with respect to the target distribution. We prove that the algorithm has an optimal geometric convergence rate, and applications show that it may converge extreme rapidly. (C) 2002 Elsevier Science B.V. All rights reserved.
This paper considers positive recurrent Markov chains where the probability of remaining in the current state is arbitrarily close to 1. Specifically, conditions are given which ensure the non-existence of central limit theorems for ergodic averages of functionals of the chain. The results are motivated by applications to Metropolis-Hastings algorithms, which are constructed in terms of a rejection probability (where a rejection involves remaining at the current state). Two examples are given for commonly used algorithms: the independence sampler and the Metropolis-adjusted Langevin algorithm. The examples are rather specialized, although, in both cases, the problems which arise are typical of those commonly occurring for the particular algorithm being used.
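The failure mode for the independence sampler is easy to exhibit numerically. The following sketch uses an assumed toy pair (Cauchy target, N(0,1) candidate), not the paper's examples: the weight π/q is unbounded, so once the chain reaches the tails its holding probability approaches 1 and rejection runs grow without bound.

```python
import numpy as np

# Independence sampler whose N(0,1) candidate has lighter tails than the
# Cauchy target.  Far out in the tail the chain rejects almost every
# candidate, producing the long holding times that destroy the CLT for
# ergodic averages.
rng = np.random.default_rng(1)

def log_w(x):
    """log(cauchy(x)/normal(x)) up to a constant; unbounded in |x|."""
    return 0.5 * x**2 - np.log1p(x**2)

n, x = 200_000, 0.0
holds, current_hold = [], 0
for _ in range(n):
    y = rng.normal()
    if np.log(rng.uniform()) <= log_w(y) - log_w(x):
        x = y
        holds.append(current_hold); current_hold = 0
    else:
        current_hold += 1  # rejection: remain at the current state

print("longest run of rejections:", max(holds + [current_hold]))
```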
We apply recent results in Markov chain theory to Hastings and Metropolis algorithms with either independent or symmetric candidate distributions, and provide necessary and sufficient conditions for the algorithms to converge at a geometric rate to a prescribed distribution π. In the independence case (in R^k) these indicate that geometric convergence essentially occurs if and only if the candidate density is bounded below by a multiple of π; in the symmetric case (in R only) we show geometric convergence essentially occurs if and only if π has geometric tails. We also evaluate recently developed computable bounds on the rates of convergence in this context: examples show that these theoretical bounds can be inherently extremely conservative, although when the chain is stochastically monotone the bounds may well be effective.
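The independence-sampler condition can be read off directly in simple cases. A sketch, under the assumed pair π = N(0,1) and candidate q = N(0, s^2) with s > 1 (not an example from the paper): the infimum of q/π is β = 1/s, and the classical bound then gives total-variation distance at most (1 − β)^n after n steps.

```python
# If q(x) >= beta * pi(x) for all x, the independence sampler is
# uniformly ergodic with total-variation rate at most (1 - beta)^n.
# For pi = N(0,1) and q = N(0, s^2) with s > 1, the ratio q/pi is
# minimized at x = 0, giving beta = 1/s.
s = 2.0
beta = 1.0 / s
for n in (1, 10, 50):
    print(f"TV bound after {n} steps: {(1 - beta)**n:.4f}")
```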
Here I propose a convergence diagnostic for Markov chain Monte Carlo (MCMC) algorithms based on couplings of a Markov chain with an auxiliary chain that is periodically restarted from a fixed parameter value. The diagnostic provides a mechanism for estimating the specific constants governing the rate of convergence of geometrically and uniformly ergodic chains, and yields a lower bound on the effective sample size of an MCMC run. It also provides a simple procedure for obtaining what is, with high probability, an independent sample from the stationary distribution.
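A minimal sketch of the restart-coupling idea, under assumed choices (independent Metropolis-Hastings for π = N(0,1) with candidate N(0,4), fixed restart point x0 = 5, restart period k = 200; none of these come from the paper): the auxiliary chain is driven by the same candidates and uniforms as the main chain, and the coalescence times observed after each restart carry the rate information the diagnostic exploits.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_w(x):
    """log(pi(x)/q(x)) up to a constant, for pi = N(0,1), q = N(0,4)."""
    return -0.5 * x**2 + 0.125 * x**2

def imh_step(x, cand, log_u):
    """One independent MH transition driven by shared randomness."""
    return cand if log_u <= log_w(cand) - log_w(x) else x

x, y, x0, k = 0.0, 5.0, 5.0, 200
times, t_since, coalesced = [], 0, False
for t in range(20_000):
    cand, log_u = rng.normal(0.0, 2.0), np.log(rng.uniform())
    x = imh_step(x, cand, log_u)
    y = imh_step(y, cand, log_u)   # auxiliary chain, same randomness
    t_since += 1
    if not coalesced and x == y:   # both accepted the same candidate
        times.append(t_since)
        coalesced = True
    if t % k == k - 1:             # periodic restart from the fixed point
        y, t_since, coalesced = x0, 0, False

print("mean coalescence time:", np.mean(times))
```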
In this paper we consider a continuous-time method of approximating a given distribution π using the Langevin diffusion dL_t = dW_t + (1/2)∇log π(L_t) dt. We find conditions under which this diffusion converges exponentially quickly...
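A simple Euler discretization of this diffusion (the unadjusted Langevin algorithm) shows the idea, here for an assumed target π = N(0,1) so that ∇log π(x) = −x. The discretized chain only approximates π, which is what motivates the Metropolis-adjusted variant mentioned above.

```python
import numpy as np

# Euler discretization of dL_t = dW_t + (1/2) grad log pi(L_t) dt:
# x_{k+1} = x_k + (h/2) * grad_log_pi(x_k) + sqrt(h) * xi_k.
rng = np.random.default_rng(3)
h, n = 0.1, 50_000            # step size and number of steps
x, xs = 0.0, np.empty(n)
for i in range(n):
    x = x + 0.5 * h * (-x) + np.sqrt(h) * rng.normal()
    xs[i] = x
print(xs.mean(), xs.var())    # near 0 and 1 for small h
```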
Markov chain Monte Carlo (MCMC) methods have been used extensively in statistical physics over the last 40 years, in spatial statistics for the past 20 and in Bayesian image analysis over the last decade. In the last five years, MCMC has been introduced into significance testing, general Bayesian inference and maximum likelihood estimation. This paper presents basic methodology of MCMC, emphasizing the Bayesian paradigm, conditional probability and the intimate relationship with Markov random fields in spatial statistics. Hastings algorithms are discussed, including Gibbs, Metropolis and some other variations. Pairwise difference priors are described and are used subsequently in three Bayesian applications, in each of which there is a pronounced spatial or temporal aspect to the modeling. The examples involve logistic regression in the presence of unobserved covariates and ordinal factors; the analysis of agricultural field experiments, with adjustment for fertility gradients; and processing of low-resolution medical images obtained by a gamma camera. Additional methodological issues arise in each of these applications and in the Appendices. The paper lays particular emphasis on the calculation of posterior probabilities and concurs with others in its view that MCMC facilitates a fundamental breakthrough in applied Bayesian modeling.
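As a concrete anchor for the methodology this review surveys, here is a minimal random-walk Metropolis sketch (a special case of Hastings' algorithm with a symmetric proposal); the target exp(−x^4/4) is an arbitrary illustrative choice, not one of the review's applications.

```python
import numpy as np

rng = np.random.default_rng(4)

def log_pi(x):
    """Unnormalized log target; normalizing constant is never needed."""
    return -x**4 / 4.0

x, xs = 0.0, []
for _ in range(50_000):
    y = x + rng.normal(0.0, 1.0)   # symmetric random-walk proposal
    if np.log(rng.uniform()) <= log_pi(y) - log_pi(x):
        x = y                      # accept; otherwise hold current state
    xs.append(x)
print(np.mean(xs), np.var(xs))
```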