Top-down decision tree boosting and its applications

Eiji Takimoto, Akira Maruoka

Research output: Contribution to book/report

Abstract

Top-down algorithms for constructing decision trees, such as C4.5 and CART, are known to perform boosting, with the procedure of choosing classification rules at internal nodes regarded as the base learner. In this work, by introducing a notion of pseudo-entropy functions for measuring the loss of hypotheses, we give a new insight into this boosting scheme from an information-theoretic viewpoint: whenever the base learner produces hypotheses with non-zero mutual information, the top-down algorithm reduces the conditional entropy (uncertainty) about the target function as the tree grows. Although its theoretical performance guarantee is weaker than those of other popular boosting algorithms such as AdaBoost, top-down algorithms can naturally handle multiclass classification problems. Furthermore, we propose a base learner LIN that produces linear classification functions, and we carry out experiments to examine the performance of the top-down algorithm with LIN as the base learner. The results show that the algorithm can sometimes perform as well as or better than AdaBoost.
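To make the boosting view of top-down tree growth concrete, the following is a minimal, self-contained sketch and not the authors' code: it grows a tree greedily, always splitting the leaf whose best split gives the largest weighted reduction of a pseudo-entropy. For illustration, the Gini index stands in as the pseudo-entropy and a brute-force axis-parallel threshold search stands in for the base learner; the paper's LIN base learner would return linear classification functions instead.

```python
# Sketch of top-down decision tree boosting (illustrative assumptions:
# Gini index as the pseudo-entropy, axis-parallel splits as the base learner).
import numpy as np

def gini(y):
    """Pseudo-entropy of a label sample (here: Gini index)."""
    if len(y) == 0:
        return 0.0
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def base_learner(X, y):
    """Return (feature, threshold, entropy drop) of the best axis-parallel split."""
    best = (None, None, 0.0)
    n = len(y)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            drop = gini(y) - (len(left) / n) * gini(left) - (len(right) / n) * gini(right)
            if drop > best[2]:
                best = (j, t, drop)
    return best

def grow_tree(X, y, max_leaves=8):
    """Greedy top-down growth: always split the leaf whose best split
    yields the largest weighted drop in pseudo-entropy."""
    leaves = [np.arange(len(y))]        # each leaf = array of sample indices
    splits = []                         # (leaf indices, feature, threshold), in split order
    while len(leaves) < max_leaves:
        best_gain, best_i, best_split = 0.0, None, None
        for i, idx in enumerate(leaves):
            j, t, drop = base_learner(X[idx], y[idx])
            gain = (len(idx) / len(y)) * drop   # weight by the leaf's probability mass
            if j is not None and gain > best_gain:
                best_gain, best_i, best_split = gain, i, (j, t)
        if best_i is None:              # no leaf admits an entropy-reducing split
            break
        idx = leaves.pop(best_i)
        j, t = best_split
        splits.append((idx, j, t))
        leaves.append(idx[X[idx, j] <= t])
        leaves.append(idx[X[idx, j] > t])
    return splits, leaves

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)     # toy binary target
    splits, leaves = grow_tree(X, y)
    remaining = sum(gini(y[idx]) * len(idx) / len(y) for idx in leaves)
    print(len(leaves), "leaves; remaining weighted entropy:", remaining)
```

Each accepted split only requires the base learner's hypothesis to have non-zero mutual information with the target on that leaf, which is what drives the steady reduction of conditional entropy described in the abstract; replacing the axis-parallel search with a linear separator would give a rough analogue of LIN and extends naturally to multiple classes.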

Original language: English
Title of host publication: Progress in Discovery Science
Publisher: Springer Verlag
Pages: 327-337
Number of pages: 11
ISBN (Print): 3540433384, 9783540433385
DOI
Publication status: Published - 2002
Externally published: Yes

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 2281
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)

