Abstract
We propose a local boosting method for classification problems, borrowing an idea from the local likelihood method. Our proposal, local boosting, includes a simple localization device that keeps the computation feasible. We prove the Bayes risk consistency of local boosting in the framework of probably approximately correct (PAC) learning. Inspection of the proof provides a useful viewpoint for comparing ordinary boosting and local boosting with respect to the estimation error and the approximation error. Both boosting methods are Bayes risk consistent if their approximation errors decrease to zero. Compared to ordinary boosting, local boosting may perform better by controlling the trade-off between the estimation error and the approximation error. Ordinary boosting with complicated base classifiers, or other strong classification methods such as kernel machines, may achieve classification performance comparable to that of local boosting with simple base classifiers, for example, decision stumps. Local boosting, however, has an advantage with respect to interpretability. Local boosting with simple base classifiers offers a simple way to identify which features are informative and how their values contribute to the classification rule, albeit only locally. Several numerical studies on real data sets confirm these advantages of local boosting.
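The sketch below illustrates, in a hedged way, one form such a localized boosting rule could take: training examples are weighted by a Gaussian kernel centred at the query point, and an ensemble of decision stumps is boosted under those weights. The kernel choice, the bandwidth parameter, and the use of scikit-learn's AdaBoostClassifier are illustrative assumptions, not the paper's exact localization device.

```python
# Hypothetical sketch of a locally weighted boosting prediction.
# Assumptions (not from the paper): Gaussian localization kernel,
# AdaBoost with decision stumps as the boosted base learner.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier


def local_boost_predict(X_train, y_train, x_query, bandwidth=1.0, n_estimators=50):
    """Predict the label of x_query with a locally weighted stump ensemble."""
    # Localization: Gaussian weights that decay with distance from the query point.
    dist2 = np.sum((X_train - x_query) ** 2, axis=1)
    weights = np.exp(-dist2 / (2.0 * bandwidth ** 2))
    weights /= weights.sum()

    # Boosting with decision stumps (AdaBoost's default base learner),
    # started from the localized sample weights.
    clf = AdaBoostClassifier(n_estimators=n_estimators)
    clf.fit(X_train, y_train, sample_weight=weights)
    return clf.predict(x_query.reshape(1, -1))[0]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # simple linear boundary for illustration
    print(local_boost_predict(X, y, np.array([0.5, 0.5]), bandwidth=0.8))
```

Because each stump in the local ensemble splits on a single feature, the fitted rule near a query point directly indicates which features drive the decision there, which is the interpretability advantage the abstract refers to.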
| Original language | English |
| --- | --- |
| Pages (from-to) | 2792-2838 |
| Number of pages | 47 |
| Journal | Neural Computation |
| Volume | 20 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - Nov 1 2008 |
All Science Journal Classification (ASJC) codes
- Arts and Humanities (miscellaneous)
- Cognitive Neuroscience