On-line estimation of hidden Markov model parameters

Jun Mizuno, Tatsuya Watanabe, Kazuya Ueki, Kazuyuki Amano, Eiji Takimoto, Akira Maruoka

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

9 Citations (Scopus)

Abstract

In modeling various signals, such as speech, with a Hidden Markov Model (HMM), it is often necessary to adapt not only to the inherent nonstationarity of the signal but also to changes in the sources (speakers) that produce it. The well-known Baum-Welch algorithm adjusts the HMM so as to optimize the fit between the model and the observed signal. In this paper we develop an algorithm, which we call the on-line Baum-Welch algorithm, by incorporating a learning rate into the off-line Baum-Welch algorithm. The algorithm proceeds in a series of trials. In each trial the algorithm produces an HMM M_t, then receives a symbol sequence w_t and incurs the loss -ln Pr(w_t|M_t), the negative log-likelihood of M_t evaluated at w_t. The performance of the algorithm is measured by its additional total loss, called the regret, over the total loss of a standard algorithm, which serves as the criterion for measuring relative loss; we take the off-line Baum-Welch algorithm as this standard. For comparison we also evaluate the Gradient Descent algorithm. Our experiments show that the on-line Baum-Welch algorithm performs well compared to the Gradient Descent algorithm. We carry out experiments not only on artificial data but also on reasonably realistic data obtained by transforming acoustic waveforms into symbol sequences via vector quantization. The results show that the on-line Baum-Welch algorithm adapts to changes of speakers very well.
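The abstract does not spell out the update rule, but one plausible reading of "incorporating a learning rate into the off-line Baum-Welch algorithm" is: at each trial, run forward-backward on the received sequence, form the single-sequence Baum-Welch re-estimate, and blend it into the current parameters with a learning rate eta, charging the trial's loss -ln Pr(w_t|M_t). The sketch below illustrates that reading for a discrete-observation HMM; it is not the authors' exact procedure, and the names (`forward_backward`, `online_baum_welch_step`, `eta`) are illustrative.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Scaled forward-backward pass for a discrete HMM.
    Returns per-state posteriors gamma, summed transition posteriors xi,
    and the log-likelihood ln Pr(obs | pi, A, B)."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    scale = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]
    beta = np.zeros((T, N))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]
    gamma = alpha * beta                # gamma[t, i] = Pr(state i at time t | obs)
    xi = np.zeros((N, N))               # xi[i, j] = expected i -> j transition count
    for t in range(T - 1):
        xi += (alpha[t][:, None] * A *
               (B[:, obs[t + 1]] * beta[t + 1])[None, :]) / scale[t + 1]
    return gamma, xi, np.log(scale).sum()

def online_baum_welch_step(pi, A, B, obs, eta):
    """One on-line trial: incur loss -ln Pr(obs | model), then move the
    parameters toward the single-sequence Baum-Welch re-estimate by a
    fraction eta (eta = 1 recovers one full off-line re-estimation step)."""
    gamma, xi, loglik = forward_backward(pi, A, B, obs)
    pi_hat = gamma[0] / gamma[0].sum()
    A_hat = xi / xi.sum(axis=1, keepdims=True)
    B_hat = np.zeros_like(B)
    for t, o in enumerate(obs):
        B_hat[:, o] += gamma[t]
    B_hat /= B_hat.sum(axis=1, keepdims=True)
    pi_new = (1 - eta) * pi + eta * pi_hat
    A_new = (1 - eta) * A + eta * A_hat
    B_new = (1 - eta) * B + eta * B_hat
    return pi_new, A_new, B_new, -loglik    # loss of this trial
```

Because each update is a convex combination of stochastic matrices, the parameters remain valid probability distributions after every trial; eta then controls how quickly the model can track a change of speaker.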

Original language: English
Title of host publication: Discovery Science - 3rd International Conference, DS 2000, Proceedings
Publisher: Springer Verlag
Pages: 155-169
Number of pages: 15
ISBN (Print): 9783540413523
Publication status: Published - 1 Jan 2000
Externally published: Yes
Event: 3rd International Conference on Discovery Science, DS 2000 - Kyoto, Japan
Duration: 4 Dec 2000 to 6 Dec 2000

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 1967
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 3rd International Conference on Discovery Science, DS 2000
Country: Japan
City: Kyoto
Period: 12/4/00 to 12/6/00

Fingerprint

Hidden Markov models
Markov Model
Descent Algorithm
Gradient Algorithm
Gradient Descent
Learning Rate
Nonstationarity
Regret
Vector Quantization
Line
Speech Signal
Waveform
Experiment
Likelihood
Acoustics

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science(all)

Cite this

Mizuno, J., Watanabe, T., Ueki, K., Amano, K., Takimoto, E., & Maruoka, A. (2000). On-line estimation of hidden Markov model parameters. In Discovery Science - 3rd International Conference, DS 2000, Proceedings (pp. 155-169). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 1967). Springer Verlag.

@inproceedings{688668affb5c40fcb8e247f7b13c9d1a,
title = "On-line estimation of hidden Markov model parameters",
author = "Jun Mizuno and Tatsuya Watanabe and Kazuya Ueki and Kazuyuki Amano and Eiji Takimoto and Akira Maruoka",
year = "2000",
month = "1",
day = "1",
language = "English",
isbn = "9783540413523",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "155--169",
booktitle = "Discovery Science - 3rd International Conference, DS 2000, Proceedings",
address = "Germany",

}

TY - GEN

T1 - On-line estimation of hidden Markov model parameters

AU - Mizuno, Jun

AU - Watanabe, Tatsuya

AU - Ueki, Kazuya

AU - Amano, Kazuyuki

AU - Takimoto, Eiji

AU - Maruoka, Akira

PY - 2000/1/1

Y1 - 2000/1/1

UR - http://www.scopus.com/inward/record.url?scp=84974667029&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84974667029&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84974667029

SN - 9783540413523

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 155

EP - 169

BT - Discovery Science - 3rd International Conference, DS 2000, Proceedings

PB - Springer Verlag

ER -