On the sample complexity of consistent learning with one-sided error

Eiji Takimoto, Akira Maruoka

Research output: Contribution to journal › Conference article › peer-review

Abstract

Although consistent learning is sufficient for PAC-learning, it is not known which strategy makes learning more efficient, especially with respect to sample complexity, i.e., the number of examples required. As a first step toward this problem, classes that have consistent learning algorithms with one-sided error are considered. A combinatorial quantity called the maximal particle set is introduced, and an upper bound on the sample complexity of consistent learning with one-sided error is obtained in terms of maximal particle sets. For the class of n-dimensional axis-parallel rectangles, one of the classes that are consistently learnable with one-sided error, the cardinality of the maximal particle set is estimated and an O(d/ε + (1/ε) log(1/δ)) upper bound on the sample complexity of the learning algorithm for the class is obtained. This bound improves the bounds due to Blumer et al. and matches the lower bound to within a constant factor.
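A standard consistent learner with one-sided error for axis-parallel rectangles is the tightest-fit (closure) strategy: output the smallest rectangle enclosing all positive examples. Since that rectangle is contained in the target, the hypothesis can only err with false negatives, never false positives. The sketch below illustrates this setting in Python; it is a minimal illustration of the tightest-fit idea, not a reproduction of the paper's algorithm or its sample-complexity analysis.

```python
def tightest_fit(positives):
    """Fit the smallest axis-parallel rectangle containing all
    positive examples, given as tuples of coordinates.

    Returns a list of (lo, hi) bounds, one pair per dimension.
    """
    dims = len(positives[0])
    return [(min(p[d] for p in positives), max(p[d] for p in positives))
            for d in range(dims)]

def predict(bounds, x):
    """Classify x as positive iff it lies inside the fitted rectangle.

    Because the fitted rectangle is contained in the target rectangle,
    this hypothesis never produces false positives (one-sided error).
    """
    return all(lo <= xi <= hi for (lo, hi), xi in zip(bounds, x))
```

For example, fitting to the positive points (1, 1), (3, 4), (2, 2) yields the bounds [(1, 3), (1, 4)]; a point such as (2, 3) inside those bounds is classified positive, while points outside are classified negative, some of which may be false negatives.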

Original language: English
Pages (from-to): 518-525
Number of pages: 8
Journal: IEICE Transactions on Information and Systems
Volume: E78-D
Issue number: 5
Publication status: Published - May 1, 1995
Externally published: Yes
Event: Proceedings of the IEICE Transaction on Information and Systems - Tokyo, Japan
Duration: Nov 1, 1993 – Nov 1, 1993

All Science Journal Classification (ASJC) codes

  • Software
  • Hardware and Architecture
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
  • Artificial Intelligence
