Relative entropy under mappings by stochastic matrices

Joel E. Cohen, Yoh Iwasa, Gh Rautu, Mary Beth Ruskai, Eugene Seneta, Gh Zbaganu

Research output: Contribution to journal › Article › peer-review

47 Citations (Scopus)

Abstract

The relative g-entropy of two finite, discrete probability distributions x = (x1,...,xn) and y = (y1,...,yn) is defined as Hg(x,y) = Σk xk g(yk/xk - 1), where g:(-1,∞)→R is convex and g(0) = 0. When g(t) = -log(1 + t), then Hg(x,y) = Σk xk log(xk/yk), the usual relative entropy. Let Pn = {x ∈ Rn : Σi xi = 1, xi > 0 ∀i}. Our major result is that, for any m × n column-stochastic matrix A, the contraction coefficient defined as ηg(A) = sup{Hg(Ax,Ay)/Hg(x,y) : x,y ∈ Pn, x ≠ y} satisfies ηg(A) ≤ 1 - α(A), where α(A) = minj,k Σi min(aij, aik) is Dobrushin's coefficient of ergodicity. Consequently, ηg(A) < 1 if and only if A is scrambling. Upper and lower bounds on ηg(A) are established. Analogous results hold for Markov chains in continuous time.
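
To make the abstract's definitions concrete, the following minimal Python/NumPy sketch (not from the paper) implements the relative g-entropy Hg(x,y), Dobrushin's coefficient α(A), and numerically checks the contraction bound Hg(Ax,Ay) ≤ (1 - α(A))·Hg(x,y) for the usual relative entropy choice g(t) = -log(1 + t). The function names, the random column-stochastic matrix, and the sampled distributions are illustrative assumptions, not part of the original article.

```python
import numpy as np

def relative_g_entropy(x, y, g):
    """H_g(x, y) = sum_k x_k * g(y_k / x_k - 1)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.sum(x * g(y / x - 1.0)))

def dobrushin_alpha(A):
    """Dobrushin's coefficient alpha(A) = min over column pairs (j, k)
    of sum_i min(a_ij, a_ik), for a column-stochastic matrix A."""
    A = np.asarray(A, dtype=float)
    n = A.shape[1]
    return min(np.minimum(A[:, j], A[:, k]).sum()
               for j in range(n) for k in range(n) if j != k)

# g(t) = -log(1 + t) recovers the usual relative entropy sum_k x_k log(x_k / y_k).
g = lambda t: -np.log1p(t)

rng = np.random.default_rng(0)

# Random 4 x 3 column-stochastic matrix (each column sums to 1).
A = rng.random((4, 3))
A /= A.sum(axis=0)

alpha = dobrushin_alpha(A)

# Sample random x, y in P_3 and record the worst observed ratio
# H_g(Ax, Ay) / H_g(x, y); it should not exceed 1 - alpha(A).
worst_ratio = 0.0
for _ in range(5000):
    x = rng.random(3); x /= x.sum()
    y = rng.random(3); y /= y.sum()
    ratio = relative_g_entropy(A @ x, A @ y, g) / relative_g_entropy(x, y, g)
    worst_ratio = max(worst_ratio, ratio)

print(f"alpha(A)      = {alpha:.4f}")
print(f"1 - alpha(A)  = {1 - alpha:.4f}")
print(f"worst ratio   = {worst_ratio:.4f}")
```

Any other convex g on (-1,∞) with g(0) = 0 can be substituted for the lambda above to test the same inequality for a different relative g-entropy.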

Original language: English
Pages (from-to): 211-235
Number of pages: 25
Journal: Linear Algebra and Its Applications
Volume: 179
Issue number: C
DOI
Publication status: Published - Jan 15 1993

All Science Journal Classification (ASJC) codes

  • Algebra and Number Theory
  • Numerical Analysis
  • Geometry and Topology
  • Discrete Mathematics and Combinatorics
