### Abstract

We discuss the properties of Jeffreys mixture for a Markov model. First, we show that a modified Jeffreys mixture asymptotically achieves the minimax coding regret for universal data compression, where we do not put any restriction on data sequences. Moreover, we give an approximation formula for the prediction probability of Jeffreys mixture for a Markov model. By this formula, it is revealed that the prediction probability by Jeffreys mixture for the Markov model with alphabet {0,1} is not of the form (n_{x|s} + α)/(n_s + β), where n_{x|s} is the number of occurrences of the symbol x following the context s ∈ {0,1} and n_s = n_{0|s} + n_{1|s}. Finally, we propose a method to compute our minimax strategy, which combines a Monte Carlo method with the approximation formula: the former is used for earlier stages of the data, while the latter is used for later stages.
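As an illustration of the form discussed in the abstract, here is a minimal Python sketch (not from the paper) of a context-wise "add-α" predictor (n_{x|s} + α)/(n_s + 2α) for a binary first-order Markov source; the paper's result is precisely that Jeffreys mixture for Markov sources does *not* reduce to a rule of this shape:

```python
def add_alpha_predictor(seq, alpha=0.5):
    """Context-wise add-alpha predictor for a binary first-order
    Markov source: estimates P(next = x | context s) as
    (n_{x|s} + alpha) / (n_s + 2*alpha), where n_{x|s} counts
    occurrences of symbol x following context s in seq."""
    counts = {0: [0, 0], 1: [0, 0]}  # counts[s][x] = n_{x|s}
    for prev, cur in zip(seq, seq[1:]):
        counts[prev][cur] += 1
    s = seq[-1]  # the context for the next symbol is the last symbol
    n_s = counts[s][0] + counts[s][1]
    return [(counts[s][x] + alpha) / (n_s + 2 * alpha) for x in (0, 1)]

# Predict the next-symbol distribution after observing 0,1,1,0,1:
p = add_alpha_predictor([0, 1, 1, 0, 1])
```

With α = 1/2 this is the Krichevsky–Trofimov rule applied per context; the paper shows the exact Jeffreys-mixture prediction for the Markov model cannot be written with any constants α, β in this way.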

| Original language | English |
|---|---|
| Article number | 6307868 |
| Pages (from-to) | 438-457 |
| Number of pages | 20 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 59 |
| Issue number | 1 |
| DOIs | https://doi.org/10.1109/TIT.2012.2219171 |
| Publication status | Published - Jan 7 2013 |


### All Science Journal Classification (ASJC) codes

- Information Systems
- Computer Science Applications
- Library and Information Sciences

### Cite this

Takeuchi, J., Kawabata, T., & Barron, A. R. (2013). Properties of Jeffreys mixture for Markov sources. *IEEE Transactions on Information Theory*, *59*(1), 438-457. [6307868]. https://doi.org/10.1109/TIT.2012.2219171

Research output: Contribution to journal › Article

TY - JOUR

T1 - Properties of Jeffreys mixture for Markov sources

AU - Takeuchi, Junnichi

AU - Kawabata, Tsutomu

AU - Barron, Andrew R.

PY - 2013/1/7

Y1 - 2013/1/7

N2 - We discuss the properties of Jeffreys mixture for a Markov model. First, we show that a modified Jeffreys mixture asymptotically achieves the minimax coding regret for universal data compression, where we do not put any restriction on data sequences. Moreover, we give an approximation formula for the prediction probability of Jeffreys mixture for a Markov model. By this formula, it is revealed that the prediction probability by Jeffreys mixture for the Markov model with alphabet {0,1} is not of the form (n_{x|s} + α)/(n_s + β), where n_{x|s} is the number of occurrences of the symbol x following the context s ∈ {0,1} and n_s = n_{0|s} + n_{1|s}. Finally, we propose a method to compute our minimax strategy, which combines a Monte Carlo method with the approximation formula: the former is used for earlier stages of the data, while the latter is used for later stages.

AB - We discuss the properties of Jeffreys mixture for a Markov model. First, we show that a modified Jeffreys mixture asymptotically achieves the minimax coding regret for universal data compression, where we do not put any restriction on data sequences. Moreover, we give an approximation formula for the prediction probability of Jeffreys mixture for a Markov model. By this formula, it is revealed that the prediction probability by Jeffreys mixture for the Markov model with alphabet {0,1} is not of the form (n_{x|s} + α)/(n_s + β), where n_{x|s} is the number of occurrences of the symbol x following the context s ∈ {0,1} and n_s = n_{0|s} + n_{1|s}. Finally, we propose a method to compute our minimax strategy, which combines a Monte Carlo method with the approximation formula: the former is used for earlier stages of the data, while the latter is used for later stages.

UR - http://www.scopus.com/inward/record.url?scp=84871784306&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84871784306&partnerID=8YFLogxK

U2 - 10.1109/TIT.2012.2219171

DO - 10.1109/TIT.2012.2219171

M3 - Article

AN - SCOPUS:84871784306

VL - 59

SP - 438

EP - 457

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 1

M1 - 6307868

ER -