Properties of Jeffreys mixture for Markov sources

Junnichi Takeuchi, Tsutomu Kawabata, Andrew R. Barron

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

We discuss the properties of Jeffreys mixture for a Markov model. First, we show that a modified Jeffreys mixture asymptotically achieves the minimax coding regret for universal data compression, where we do not put any restriction on data sequences. Moreover, we give an approximation formula for the prediction probability of Jeffreys mixture for a Markov model. By this formula, it is revealed that the prediction probability by Jeffreys mixture for the Markov model with alphabet {0,1} is not of the form (n_{x|s} + α)/(n_s + β), where n_{x|s} is the number of occurrences of the symbol x following the context s ∈ {0,1} and n_s = n_{0|s} + n_{1|s}. Finally, we propose a method to compute our minimax strategy that combines a Monte Carlo method with the approximation formula: the former is used for the earlier stages of the data, while the latter is used for the later stages.
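As a rough sketch of the two objects the abstract contrasts, the code below implements, for the binary first-order Markov model, (a) a predictor of the add-constant form (n_{x|s} + α)/(n_s + β) with the Krichevsky-Trofimov values α = 1/2, β = 1, and (b) a Monte Carlo estimate of a mixture predictive probability. All function names are illustrative, and the Monte Carlo sketch samples each transition probability from an independent Beta(1/2, 1/2) prior, which is a stand-in: the exact Jeffreys prior for the Markov model is not of this product form, which is precisely why the paper's predictor deviates from the add-constant form.

```python
import random

def add_constant_prediction(seq, alpha=0.5, beta=1.0):
    """P(next symbol = '1' | seq) under the add-constant form
    (n_{1|s} + alpha) / (n_s + beta), with context s = last symbol of seq."""
    counts = {"0": {"0": 0, "1": 0}, "1": {"0": 0, "1": 0}}
    for prev, cur in zip(seq, seq[1:]):
        counts[prev][cur] += 1
    s = seq[-1]
    n_s = counts[s]["0"] + counts[s]["1"]
    return (counts[s]["1"] + alpha) / (n_s + beta)

def markov_likelihood(seq, t10, t11):
    """Likelihood of the binary string seq given transition probabilities
    P(1|0) = t10 and P(1|1) = t11 (the first symbol is conditioned on)."""
    p = 1.0
    for prev, cur in zip(seq, seq[1:]):
        t = t10 if prev == "0" else t11
        p *= t if cur == "1" else 1.0 - t
    return p

def mc_mixture_prediction(seq, samples=5000, seed=0):
    """Monte Carlo estimate of a mixture predictive probability
    p(next = '1' | seq) = E[likelihood * P(1|s)] / E[likelihood],
    sampling each transition probability from Beta(1/2, 1/2).
    NOTE: this product-of-Betas prior is only an illustration; the
    exact Jeffreys prior for the Markov model has a different form."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(samples):
        t10 = rng.betavariate(0.5, 0.5)
        t11 = rng.betavariate(0.5, 0.5)
        like = markov_likelihood(seq, t10, t11)
        den += like
        num += like * (t11 if seq[-1] == "1" else t10)
    return num / den
```

With the independent per-context Beta priors used here, the Monte Carlo estimate converges to the add-constant value, so the two functions agree in the limit; the paper's result is that the predictor induced by the actual Jeffreys prior on the Markov model does not reduce to any such add-constant form.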

Original language: English
Article number: 6307868
Pages (from-to): 438-457
Number of pages: 20
Journal: IEEE Transactions on Information Theory
Volume: 59
Issue number: 1
DOI: 10.1109/TIT.2012.2219171
Publication status: Published - Jan 7 2013


All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Cite this

@article{452505284d2949de88b925d44f6e7d81,
title = "Properties of Jeffreys mixture for Markov sources",
abstract = "We discuss the properties of Jeffreys mixture for a Markov model. First, we show that a modified Jeffreys mixture asymptotically achieves the minimax coding regret for universal data compression, where we do not put any restriction on data sequences. Moreover, we give an approximation formula for the prediction probability of Jeffreys mixture for a Markov model. By this formula, it is revealed that the prediction probability by Jeffreys mixture for the Markov model with alphabet {0,1} is not of the form (n_{x|s} + α)/(n_s + β), where n_{x|s} is the number of occurrences of the symbol x following the context s ∈ {0,1} and n_s = n_{0|s} + n_{1|s}. Finally, we propose a method to compute our minimax strategy that combines a Monte Carlo method with the approximation formula: the former is used for the earlier stages of the data, while the latter is used for the later stages.",
author = "Junnichi Takeuchi and Tsutomu Kawabata and Barron, {Andrew R.}",
year = "2013",
month = "1",
day = "7",
doi = "10.1109/TIT.2012.2219171",
language = "English",
volume = "59",
pages = "438--457",
journal = "IEEE Transactions on Information Theory",
issn = "0018-9448",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "1",

}
