Improved sample complexity bounds for parameter estimation

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

Various authors have proposed probabilistic extensions of Valiant's PAC (Probably Approximately Correct) learning model in which the target to be learned is a (conditional) probability distribution. In this paper, we improve upon the best known upper bounds on the sample complexity of the parameter estimation part of the learning problem for distributions and stochastic rules over a finite domain with respect to the Kullback-Leibler divergence (KL-divergence). In particular, we improve the upper bound of order O(1/ε²) due to Abe, Takeuchi, and Warmuth to a bound of order O(1/ε). In obtaining our results, we make use of the properties of a specific estimator (a slightly modified maximum likelihood estimator) with respect to the KL-divergence, whereas previously known upper bounds were obtained using the uniform convergence technique.
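A minimal sketch of the setting described in the abstract: estimating a distribution over a finite domain and measuring error by KL-divergence. The abstract does not spell out the paper's exact modification of the maximum likelihood estimator, so the add-constant (Laplace-style) smoothing below is an assumed stand-in, used purely for illustration; smoothing keeps every estimated probability strictly positive, which is what keeps the KL-divergence finite even when some domain element never appears in the sample.

```python
import math
import random

def smoothed_mle(samples, domain_size, c=1.0):
    """Add-constant estimate: (count_i + c) / (n + c * domain_size).

    An assumed illustration of a 'slightly modified' ML estimator;
    the paper's actual modification may differ.
    """
    n = len(samples)
    counts = [0] * domain_size
    for x in samples:
        counts[x] += 1
    return [(counts[i] + c) / (n + c * domain_size) for i in range(domain_size)]

def kl_divergence(p, q):
    """KL(p || q) = sum_i p_i * log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

if __name__ == "__main__":
    random.seed(0)
    p = [0.5, 0.3, 0.15, 0.05]      # unknown target distribution
    # An O(1/ε) sample complexity bound corresponds to the KL error
    # shrinking roughly like 1/n as the sample size n grows.
    for n in (100, 1000, 10000):
        samples = random.choices(range(len(p)), weights=p, k=n)
        q = smoothed_mle(samples, len(p))
        print(f"n={n:6d}  KL(p||q) = {kl_divergence(p, q):.5f}")
```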

Original language: English
Pages (from-to): 526-531
Number of pages: 6
Journal: IEICE Transactions on Information and Systems
Volume: E78-D
Issue number: 5
Publication status: Published - May 1 1995
Externally published: Yes
Event: Proceedings of the IEICE Transactions on Information and Systems - Tokyo, Japan
Duration: Nov 1 1993 - Nov 1 1993

All Science Journal Classification (ASJC) codes

  • Software
  • Hardware and Architecture
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
  • Artificial Intelligence
