TY - GEN
T1 - A polynomial-time perfect sampler for the Q-Ising with a vertex-independent noise
AU - Yamamoto, M.
AU - Kijima, S.
AU - Matsui, Y.
N1 - Copyright:
Copyright 2012 Elsevier B.V., All rights reserved.
PY - 2009
Y1 - 2009
N2 - We present a polynomial-time perfect sampler for the Q-Ising with a vertex-independent noise. The Q-Ising, a generalization of the Ising model, arose in the context of Bayesian image restoration in statistical mechanics. We study the distribution of the Q-Ising on a two-dimensional square lattice over n vertices, that is, we deal with the discrete state space {1,...,Q}^n for a positive integer Q. Employing the Q-Ising (with a parameter β) as a prior distribution and assuming Gaussian noise (with another parameter α), a posterior is obtained from Bayes' formula. We then generalize this setting: the noise distribution need not be Gaussian but may be any vertex-independent noise. We first present a Gibbs sampler for our posterior, and then a perfect sampler obtained by defining a coupling via a monotone update function. We show an O(n log n) mixing time of the Gibbs sampler for the generalized model under the condition that β is sufficiently small (regardless of the noise distribution). In the Gaussian case, we obtain another, more natural condition for rapid mixing, namely that α is sufficiently large relative to β. Thereby, we show that the expected running time of our perfect sampler is O(n log n).
AB - We present a polynomial-time perfect sampler for the Q-Ising with a vertex-independent noise. The Q-Ising, a generalization of the Ising model, arose in the context of Bayesian image restoration in statistical mechanics. We study the distribution of the Q-Ising on a two-dimensional square lattice over n vertices, that is, we deal with the discrete state space {1,...,Q}^n for a positive integer Q. Employing the Q-Ising (with a parameter β) as a prior distribution and assuming Gaussian noise (with another parameter α), a posterior is obtained from Bayes' formula. We then generalize this setting: the noise distribution need not be Gaussian but may be any vertex-independent noise. We first present a Gibbs sampler for our posterior, and then a perfect sampler obtained by defining a coupling via a monotone update function. We show an O(n log n) mixing time of the Gibbs sampler for the generalized model under the condition that β is sufficiently small (regardless of the noise distribution). In the Gaussian case, we obtain another, more natural condition for rapid mixing, namely that α is sufficiently large relative to β. Thereby, we show that the expected running time of our perfect sampler is O(n log n).
UR - http://www.scopus.com/inward/record.url?scp=76249108006&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=76249108006&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-02882-3_33
DO - 10.1007/978-3-642-02882-3_33
M3 - Conference contribution
AN - SCOPUS:76249108006
SN - 3642028810
SN - 9783642028816
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 328
EP - 337
BT - Computing and Combinatorics - 15th Annual International Conference, COCOON 2009, Proceedings
T2 - 15th Annual International Conference on Computing and Combinatorics, COCOON 2009
Y2 - 13 July 2009 through 15 July 2009
ER -