Boosting method for local learning in statistical pattern recognition

Masanori Kawakita, Shinto Eguchi

Research output: Contribution to journal › Article

6 Citations (Scopus)

Abstract

We propose a local boosting method for classification problems, borrowing an idea from the local likelihood method. Our proposal, local boosting, includes a simple localization device for computational feasibility. We prove the Bayes risk consistency of local boosting in the framework of probably approximately correct (PAC) learning. Inspection of the proof provides a useful viewpoint for comparing ordinary boosting and local boosting with respect to the estimation error and the approximation error. Both boosting methods have Bayes risk consistency if their approximation errors decrease to zero. Compared to ordinary boosting, local boosting may perform better by controlling the trade-off between the estimation error and the approximation error. Ordinary boosting with complicated base classifiers, or other strong classification methods including kernel machines, may have classification performance comparable to local boosting with simple base classifiers, for example, decision stumps. Local boosting, however, has an advantage with respect to interpretability. Local boosting with simple base classifiers offers a simple way to specify which features are informative and how their values contribute to a classification rule, even if only locally. Several numerical studies on real data sets confirm these advantages of local boosting.
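The abstract describes the method only at a high level, so the following is a rough, illustrative sketch of the general idea rather than the authors' algorithm: training points are weighted by a kernel centred at a query point (echoing the local likelihood idea), and an AdaBoost-style loop over decision stumps is run on that locally weighted sample. The Gaussian kernel, the bandwidth parameter, and all function names below are assumptions made for illustration, not details taken from the paper.

```python
# Illustrative sketch only: kernel-localized, AdaBoost-style boosting of decision stumps.
import numpy as np

def fit_stump(X, y, w):
    """Return (error, feature, threshold, sign) of the best weighted decision stump.

    X: (n, d) features; y: labels in {-1, +1}; w: nonnegative weights summing to 1.
    """
    n, d = X.shape
    best = (np.inf, 0, 0.0, 1)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, thr, sign)
    return best

def local_boost_predict(X, y, x0, rounds=20, bandwidth=1.0):
    """Classify a single query point x0 with kernel-localized AdaBoost over stumps."""
    y = np.asarray(y, dtype=float)
    # Localization: Gaussian kernel weights centred at x0 (local-likelihood-style weighting).
    k = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2.0 * bandwidth ** 2))
    w = k / k.sum()
    score = 0.0
    for _ in range(rounds):
        err, j, thr, sign = fit_stump(X, y, w)
        err = np.clip(err, 1e-10, 1.0 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        # AdaBoost reweighting applied on top of the kernel-localized weights.
        w = w * np.exp(-alpha * y * pred)
        w = w / w.sum()
        # Accumulate this stump's weighted vote at the query point itself.
        score += alpha * sign * (1 if x0[j] <= thr else -1)
    return int(np.sign(score))

# Example usage on toy data: classify one query point.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    print(local_boost_predict(X, y, x0=np.array([0.5, 0.5]), rounds=10, bandwidth=1.0))
```

Because each stump splits on a single feature, the fitted local ensemble can be read off directly (which features were selected and in which direction they push the decision near x0), which is the interpretability advantage the abstract emphasizes.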

Original language: English
Pages (from-to): 2792-2838
Number of pages: 47
Journal: Neural Computation
Volume: 20
Issue number: 11
DOI: 10.1162/neco.2008.06-07-549
Publication status: Published - Nov 1, 2008

All Science Journal Classification (ASJC) codes

  • Cognitive Neuroscience

Cite this

Boosting method for local learning in statistical pattern recognition. / Kawakita, Masanori; Eguchi, Shinto.

In: Neural Computation, Vol. 20, No. 11, 01.11.2008, p. 2792-2838.

Research output: Contribution to journal › Article

Kawakita, Masanori ; Eguchi, Shinto. / Boosting method for local learning in statistical pattern recognition. In: Neural Computation. 2008 ; Vol. 20, No. 11. pp. 2792-2838.
@article{d6b796f486274c50823508870c798bc9,
title = "Boosting method for local learning in statistical pattern recognition",
abstract = "We propose a local boosting method in classification problems borrowing from an idea of the local likelihood method. Our proposal, local boosting, includes a simple device for localization for computational feasibility. We proved the Bayes risk consistency of the local boosting in the framework of Probably approximately correct learning. Inspection of the proof provides a useful viewpoint for comparing ordinary boosting and local boosting with respect to the estimation error and the approximation error. Both boostingmethods have the Bayes risk consistency if their approximation errors decrease to zero. Compared to ordinary boosting, local boosting may perform better by controlling the trade-off between the estimation error and the approximation error. Ordinary boostingwith complicated base classifiers or other strong classification methods, including kernel machines, may have classification performance comparable to local boostingwith simple base classifiers, for example, decision stumps. Local boosting, however, has an advantage with respect to interpretability. Local boosting with simple base classifiers offers a simple way to specify which features are informative and how their values contribute to a classification rule even though locally. Several numerical studies on real data sets confirm these advantages of local boosting.",
author = "Masanori Kawakita and Shinto Eguchi",
year = "2008",
month = "11",
day = "1",
doi = "10.1162/neco.2008.06-07-549",
language = "English",
volume = "20",
pages = "2792--2838",
journal = "Neural Computation",
issn = "0899-7667",
publisher = "MIT Press Journals",
number = "11",

}

TY - JOUR

T1 - Boosting method for local learning in statistical pattern recognition

AU - Kawakita, Masanori

AU - Eguchi, Shinto

PY - 2008/11/1

Y1 - 2008/11/1

N2 - We propose a local boosting method in classification problems borrowing from an idea of the local likelihood method. Our proposal, local boosting, includes a simple device for localization for computational feasibility. We proved the Bayes risk consistency of the local boosting in the framework of Probably approximately correct learning. Inspection of the proof provides a useful viewpoint for comparing ordinary boosting and local boosting with respect to the estimation error and the approximation error. Both boosting methods have the Bayes risk consistency if their approximation errors decrease to zero. Compared to ordinary boosting, local boosting may perform better by controlling the trade-off between the estimation error and the approximation error. Ordinary boosting with complicated base classifiers or other strong classification methods, including kernel machines, may have classification performance comparable to local boosting with simple base classifiers, for example, decision stumps. Local boosting, however, has an advantage with respect to interpretability. Local boosting with simple base classifiers offers a simple way to specify which features are informative and how their values contribute to a classification rule even though locally. Several numerical studies on real data sets confirm these advantages of local boosting.

AB - We propose a local boosting method in classification problems borrowing from an idea of the local likelihood method. Our proposal, local boosting, includes a simple device for localization for computational feasibility. We proved the Bayes risk consistency of the local boosting in the framework of Probably approximately correct learning. Inspection of the proof provides a useful viewpoint for comparing ordinary boosting and local boosting with respect to the estimation error and the approximation error. Both boosting methods have the Bayes risk consistency if their approximation errors decrease to zero. Compared to ordinary boosting, local boosting may perform better by controlling the trade-off between the estimation error and the approximation error. Ordinary boosting with complicated base classifiers or other strong classification methods, including kernel machines, may have classification performance comparable to local boosting with simple base classifiers, for example, decision stumps. Local boosting, however, has an advantage with respect to interpretability. Local boosting with simple base classifiers offers a simple way to specify which features are informative and how their values contribute to a classification rule even though locally. Several numerical studies on real data sets confirm these advantages of local boosting.

UR - http://www.scopus.com/inward/record.url?scp=55749096877&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=55749096877&partnerID=8YFLogxK

U2 - 10.1162/neco.2008.06-07-549

DO - 10.1162/neco.2008.06-07-549

M3 - Article

C2 - 18533822

AN - SCOPUS:55749096877

VL - 20

SP - 2792

EP - 2838

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 11

ER -