Relative entropy under mappings by stochastic matrices

Joel E. Cohen, Yoh Iwasa, Gh. Rautu, Mary Beth Ruskai, Eugene Seneta, Gh. Zbaganu

Research output: Contribution to journal › Article

43 Citations (Scopus)

Abstract

The relative g-entropy of two finite, discrete probability distributions x = (x1,...,xn) and y = (y1,...,yn) is defined as Hg(x,y) = Σk xk g(yk/xk − 1), where g:(−1,∞)→R is convex and g(0) = 0. When g(t) = −log(1 + t), then Hg(x,y) = Σk xk log(xk/yk), the usual relative entropy. Let Pn = {x ∈ Rn : Σi xi = 1, xi > 0 ∀i}. Our major result is that, for any m × n column-stochastic matrix A, the contraction coefficient defined as ηg(A) = sup{Hg(Ax,Ay)/Hg(x,y) : x,y ∈ Pn, x ≠ y} satisfies ηg(A) ≤ 1 − α(A), where α(A) = minj,k Σi min(aij, aik) is Dobrushin's coefficient of ergodicity. Consequently, ηg(A) < 1 if and only if A is scrambling. Upper and lower bounds on ηg(A) are established. Analogous results hold for Markov chains in continuous time.
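As a quick numerical illustration (not part of the paper), the sketch below evaluates the quantities defined in the abstract for the special case g(t) = −log(1 + t): the relative entropy Hg(x,y), Dobrushin's coefficient of ergodicity α(A), and the ratio Hg(Ax,Ay)/Hg(x,y), which the main result bounds by 1 − α(A). The function names (relative_g_entropy, dobrushin_alpha) are illustrative, not from the paper.

```python
# Minimal sketch, assuming g(t) = -log(1 + t), i.e. the usual relative entropy (KL divergence).
import numpy as np

def relative_g_entropy(x, y):
    """H_g(x, y) = sum_k x_k * g(y_k/x_k - 1) with g(t) = -log(1 + t),
    which reduces to sum_k x_k * log(x_k / y_k). Assumes x, y > 0 componentwise."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.sum(x * np.log(x / y)))

def dobrushin_alpha(A):
    """alpha(A) = min over column pairs (j, k) of sum_i min(a_ij, a_ik),
    for a column-stochastic matrix A (each column sums to 1)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[1]
    return min(np.minimum(A[:, j], A[:, k]).sum()
               for j in range(n) for k in range(n) if j != k)

rng = np.random.default_rng(0)
A = rng.random((3, 4))
A /= A.sum(axis=0)                 # make A column-stochastic
x = rng.random(4); x /= x.sum()    # x, y in P_4
y = rng.random(4); y /= y.sum()

ratio = relative_g_entropy(A @ x, A @ y) / relative_g_entropy(x, y)
print(f"H_g(Ax,Ay)/H_g(x,y) = {ratio:.4f}  <=  1 - alpha(A) = {1 - dobrushin_alpha(A):.4f}")
```

For any single pair (x, y) the printed ratio is at most ηg(A), so by the theorem it should not exceed 1 − α(A); since the random A here has all entries positive, it is scrambling and the bound is strictly below 1.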

Original language: English
Pages (from-to): 211-235
Number of pages: 25
Journal: Linear Algebra and Its Applications
Volume: 179
Issue number: C
DOIs: https://doi.org/10.1016/0024-3795(93)90331-H
Publication status: Published - Jan 15 1993

All Science Journal Classification (ASJC) codes

  • Algebra and Number Theory
  • Numerical Analysis
  • Geometry and Topology
  • Discrete Mathematics and Combinatorics


  • Cite this

Cohen, J. E., Iwasa, Y., Rautu, G., Ruskai, M. B., Seneta, E., & Zbaganu, G. (1993). Relative entropy under mappings by stochastic matrices. Linear Algebra and Its Applications, 179(C), 211-235. https://doi.org/10.1016/0024-3795(93)90331-H