Some improved sample complexity bounds in the probabilistic PAC learning model

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Various authors have proposed probabilistic extensions of Valiant's PAC learning model in which the target to be learned is a conditional (or unconditional) probability distribution. In this paper, we improve upon the best known upper bounds on the sample complexity of learning an important class of stochastic rules called 'stochastic rules with finite partitioning' with respect to the classic notion of distance between distributions, the Kullback-Leibler divergence (KL-divergence). In particular, we improve the upper bound of order O(1/ε²) due to Abe, Takeuchi, and Warmuth [2] to a bound of order O(1/ε). Our proof technique is interesting for at least two reasons: First, previously known upper bounds with respect to the KL-divergence were obtained using the uniform convergence technique, while our improved upper bound is obtained by taking advantage of the properties of the maximum likelihood estimator. Second, our proof relies on the fact that only a linear number of examples are required in order to distinguish a true parametric model from a bad parametric model. The latter notion is apparently related to the notion of discrimination proposed and studied by Yamanishi, but the exact relationship is yet to be determined.
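The two objects at the center of the abstract can be sketched concretely. The following is a minimal illustrative example, not code from the paper: for a 'stochastic rule with finite partitioning', the maximum likelihood estimate of the conditional distribution P(y | x) is simply the empirical frequency of each outcome within each cell x of the partition, and the KL-divergence measures how far the estimate is from the true rule. Function names and the toy data below are our own assumptions.

```python
import math
from collections import Counter, defaultdict

def kl_divergence(p, q):
    """KL-divergence D(p || q) = sum_y p(y) * log(p(y) / q(y)), in nats.
    Assumes q(y) > 0 wherever p(y) > 0; distributions are dicts y -> prob."""
    return sum(py * math.log(py / q[y]) for y, py in p.items() if py > 0)

def mle_stochastic_rule(samples):
    """Maximum likelihood estimate of a conditional distribution P(y | x)
    from (x, y) pairs: the empirical conditional frequency in each cell x."""
    counts = defaultdict(Counter)
    for x, y in samples:
        counts[x][y] += 1
    return {x: {y: c / sum(cy.values()) for y, c in cy.items()}
            for x, cy in counts.items()}

if __name__ == "__main__":
    # Hypothetical true rule: P(y=1 | x=0) = 0.8, P(y=1 | x=1) = 0.3.
    samples = [(0, 1)] * 8 + [(0, 0)] * 2 + [(1, 1)] * 3 + [(1, 0)] * 7
    true_rule = {0: {0: 0.2, 1: 0.8}, 1: {0: 0.7, 1: 0.3}}
    est = mle_stochastic_rule(samples)
    for x in true_rule:
        print(x, kl_divergence(true_rule[x], est[x]))
```

The paper's result concerns how fast this per-cell divergence shrinks with the number of samples; the sketch above only shows the estimator and the distance measure, not the sample complexity argument itself.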

Original language: English
Title of host publication: Algorithmic Learning Theory - 3rd Workshop, ALT 1992, Proceedings
Editors: Shuji Doshita, Koichi Furukawa, Klaus P. Jantke, Toyaki Nishida
Publisher: Springer Verlag
Pages: 208-219
Number of pages: 12
ISBN (Print): 9783540573692
DOIs: 10.1007/3-540-57369-0_40
Publication status: Published - Jan 1 1993
Event: 3rd Workshop on Algorithmic Learning Theory, ALT 1992 - Tokyo, Japan
Duration: Oct 20 1992 - Oct 22 1992

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 743 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 3rd Workshop on Algorithmic Learning Theory, ALT 1992
Country: Japan
City: Tokyo
Period: 10/20/92 - 10/22/92

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science(all)


Cite this

    Takeuchi, J. I. (1993). Some improved sample complexity bounds in the probabilistic PAC learning model. In S. Doshita, K. Furukawa, K. P. Jantke, & T. Nishida (Eds.), Algorithmic Learning Theory - 3rd Workshop, ALT 1992, Proceedings (pp. 208-219). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 743 LNAI). Springer Verlag. https://doi.org/10.1007/3-540-57369-0_40