TY - GEN
T1 - Boosting versus covering
AU - Hatano, Kohei
AU - Warmuth, Manfred K.
PY - 2004/1/1
Y1 - 2004/1/1
N2 - We investigate improvements of AdaBoost that can exploit the fact that the weak hypotheses are one-sided, i.e. either all their positive (or negative) predictions are correct. In particular, for any set of m labeled examples consistent with a disjunction of k literals (which are one-sided in this case), AdaBoost constructs a consistent hypothesis by using O(k^2 log m) iterations. On the other hand, a greedy set covering algorithm finds a consistent hypothesis of size O(k log m). Our primary question is whether there is a simple boosting algorithm that performs as well as the greedy set covering. We first show that InfoBoost, a modification of AdaBoost proposed by Aslam for a different purpose, does perform as well as the greedy set covering algorithm. We then show that AdaBoost requires Ω(k^2 log m) iterations for learning k-literal disjunctions. We show this with an adversary construction as well as in simple experiments based on artificial data. Further, we give a variant called SemiBoost that can handle the degenerate case when the given examples all have the same label. We conclude by showing that SemiBoost can be used to produce small conjunctions as well.
AB - We investigate improvements of AdaBoost that can exploit the fact that the weak hypotheses are one-sided, i.e. either all their positive (or negative) predictions are correct. In particular, for any set of m labeled examples consistent with a disjunction of k literals (which are one-sided in this case), AdaBoost constructs a consistent hypothesis by using O(k^2 log m) iterations. On the other hand, a greedy set covering algorithm finds a consistent hypothesis of size O(k log m). Our primary question is whether there is a simple boosting algorithm that performs as well as the greedy set covering. We first show that InfoBoost, a modification of AdaBoost proposed by Aslam for a different purpose, does perform as well as the greedy set covering algorithm. We then show that AdaBoost requires Ω(k^2 log m) iterations for learning k-literal disjunctions. We show this with an adversary construction as well as in simple experiments based on artificial data. Further, we give a variant called SemiBoost that can handle the degenerate case when the given examples all have the same label. We conclude by showing that SemiBoost can be used to produce small conjunctions as well.
UR - http://www.scopus.com/inward/record.url?scp=84860966920&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84860966920&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84860966920
SN - 0262201526
SN - 9780262201520
T3 - Advances in Neural Information Processing Systems
BT - Advances in Neural Information Processing Systems 16 - Proceedings of the 2003 Conference, NIPS 2003
PB - Neural Information Processing Systems Foundation
T2 - 17th Annual Conference on Neural Information Processing Systems, NIPS 2003
Y2 - 8 December 2003 through 13 December 2003
ER -