Relative entropy under mappings by stochastic matrices

Joel E. Cohen, Yoh Iwasa, Gh Rautu, Mary Beth Ruskai, Eugene Seneta, Gh Zbaganu

Research output: Contribution to journal › Article

38 Citations (Scopus)

Abstract

The relative g-entropy of two finite, discrete probability distributions x = (x1,...,xn) and y = (y1,...,yn) is defined as Hg(x,y) = Σk xk g(yk/xk − 1), where g:(−1,∞)→R is convex and g(0) = 0. When g(t) = −log(1 + t), then Hg(x,y) = Σk xk log(xk/yk), the usual relative entropy. Let Pn = {x ∈ Rn : Σi xi = 1, xi > 0 ∀i}. Our major result is that, for any m × n column-stochastic matrix A, the contraction coefficient defined as ηg(A) = sup{Hg(Ax,Ay)/Hg(x,y) : x,y ∈ Pn, x ≠ y} satisfies ηg(A) ≤ 1 − α(A), where α(A) = minj,k Σi min(aij, aik) is Dobrushin's coefficient of ergodicity. Consequently, ηg(A) < 1 if and only if A is scrambling. Upper and lower bounds on ηg(A) are established. Analogous results hold for Markov chains in continuous time.
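The contraction bound in the abstract can be checked numerically. The sketch below (a minimal illustration assuming NumPy; the function names are ours, not the paper's) takes g(t) = −log(1 + t), so Hg is the usual relative entropy, computes Dobrushin's coefficient α(A) for a column-stochastic A, and verifies H(Ax, Ay) ≤ (1 − α(A)) H(x, y) on random positive distributions:

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_entropy(x, y):
    # H(x, y) = sum_k x_k log(x_k / y_k), i.e. H_g with g(t) = -log(1 + t).
    return float(np.sum(x * np.log(x / y)))

def dobrushin_alpha(A):
    # alpha(A) = min over column pairs (j, k) of sum_i min(a_ij, a_ik).
    n = A.shape[1]
    return min(float(np.minimum(A[:, j], A[:, k]).sum())
               for j in range(n) for k in range(n))

# A random 3x3 column-stochastic matrix (each column sums to 1).
A = rng.random((3, 3))
A /= A.sum(axis=0)
alpha = dobrushin_alpha(A)

# Random strictly positive probability vectors x != y in P_3.
x = rng.random(3); x /= x.sum()
y = rng.random(3); y /= y.sum()

lhs = relative_entropy(A @ x, A @ y)
rhs = (1 - alpha) * relative_entropy(x, y)
assert lhs <= rhs + 1e-12  # H(Ax, Ay) <= (1 - alpha(A)) H(x, y)
```

Since all entries of this random A are positive, A is scrambling, so α(A) > 0 and the mapping is a strict contraction in relative entropy, consistent with ηg(A) < 1 iff A is scrambling.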

Original language: English
Pages (from-to): 211-235
Number of pages: 25
Journal: Linear Algebra and Its Applications
Volume: 179
Issue number: C
DOI: 10.1016/0024-3795(93)90331-H
Publication status: Published - Jan 15 1993


All Science Journal Classification (ASJC) codes

  • Algebra and Number Theory
  • Numerical Analysis
  • Geometry and Topology
  • Discrete Mathematics and Combinatorics

Cite this

Cohen, J. E., Iwasa, Y., Rautu, G., Beth Ruskai, M., Seneta, E., & Zbaganu, G. (1993). Relative entropy under mappings by stochastic matrices. Linear Algebra and Its Applications, 179(C), 211-235. https://doi.org/10.1016/0024-3795(93)90331-H

@article{5e6d7be8bbff49ae890a1fc2541c8a6e,
title = "Relative entropy under mappings by stochastic matrices",
abstract = "The relative g-entropy of two finite, discrete probability distributions x = (x1,...,xn) and y = (y1,...,yn) is defined as Hg(x,y) = Σk xk g(yk/xk − 1), where g:(−1,∞)→R is convex and g(0) = 0. When g(t) = −log(1 + t), then Hg(x,y) = Σk xk log(xk/yk), the usual relative entropy. Let Pn = {x ∈ Rn : Σi xi = 1, xi > 0 ∀i}. Our major result is that, for any m × n column-stochastic matrix A, the contraction coefficient defined as ηg(A) = sup{Hg(Ax,Ay)/Hg(x,y) : x,y ∈ Pn, x ≠ y} satisfies ηg(A) ≤ 1 − α(A), where α(A) = minj,k Σi min(aij, aik) is Dobrushin's coefficient of ergodicity. Consequently, ηg(A) < 1 if and only if A is scrambling. Upper and lower bounds on ηg(A) are established. Analogous results hold for Markov chains in continuous time.",
author = "Cohen, {Joel E.} and Yoh Iwasa and Gh Rautu and {Beth Ruskai}, Mary and Eugene Seneta and Gh Zbaganu",
year = "1993",
month = "1",
day = "15",
doi = "10.1016/0024-3795(93)90331-H",
language = "English",
volume = "179",
pages = "211--235",
journal = "Linear Algebra and Its Applications",
issn = "0024-3795",
publisher = "Elsevier Inc.",
number = "C",

}
