Université Paris 6
Pierre et Marie Curie
Université Paris 7
Denis Diderot

CNRS U.M.R. 7599
"Probabilités et Modèles Aléatoires"

On the Bayes-risk consistency of regularized boosting methods

Author(s):

MSC Classification Code(s):

Abstract: The probability of error of classification methods based on convex combinations of simple base classifiers, produced by "boosting" algorithms, is investigated. The main result of the paper is that certain regularized boosting algorithms yield Bayes-risk consistent classifiers under the sole assumption that the Bayes classifier can be approximated by a convex combination of the base classifiers. Non-asymptotic, distribution-free bounds are also developed, which offer new insight into how boosting works and help explain its success in practical classification problems.
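The paper's own regularized procedure is not reproduced here, but the general idea the abstract describes — a convex combination of simple base classifiers fitted under a convex (here, exponential) cost, with a shrinkage factor standing in for the regularization — can be sketched as follows. All names, parameters, and the toy data are illustrative, not taken from the paper:

```python
import math

def stump(threshold, sign):
    # Base classifier: a decision stump on a 1-D feature, output in {-1, +1}.
    return lambda x: sign if x > threshold else -sign

def boost(xs, ys, thresholds, rounds=20, shrinkage=0.1):
    """Exponential-loss boosting over decision stumps.

    The learned classifier is sign(sum_t c_t * h_t(x)); the shrinkage
    factor limits the growth of the coefficients, a crude stand-in for
    the convex-combination constraint discussed in the abstract.
    """
    weights = [1.0 / len(xs)] * len(xs)  # example weights
    ensemble = []                        # (coefficient, stump) pairs
    for _ in range(rounds):
        # Pick the stump with the smallest weighted training error.
        best, best_err = None, float("inf")
        for th in thresholds:
            for sg in (+1, -1):
                h = stump(th, sg)
                err = sum(w for w, x, y in zip(weights, xs, ys) if h(x) != y)
                if err < best_err:
                    best, best_err = h, err
        best_err = min(max(best_err, 1e-10), 1 - 1e-10)
        coef = shrinkage * 0.5 * math.log((1 - best_err) / best_err)
        ensemble.append((coef, best))
        # Re-weight examples toward the current mistakes.
        weights = [w * math.exp(-coef * y * best(x))
                   for w, x, y in zip(weights, xs, ys)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return lambda x: 1 if sum(c * h(x) for c, h in ensemble) > 0 else -1

# Toy 1-D data: label +1 iff x > 0.5.
xs = [0.1, 0.2, 0.4, 0.6, 0.8, 0.9]
ys = [-1, -1, -1, 1, 1, 1]
f = boost(xs, ys, thresholds=[0.3, 0.5, 0.7])
print([f(x) for x in xs])  # → [-1, -1, -1, 1, 1, 1]
```

The combined classifier is a weighted vote over stumps; regularization (here, shrinkage and a fixed number of rounds) is what prevents the overfitting that unconstrained boosting can exhibit.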

Keywords: boosting; overfitting; data classification; Bayes-risk consistency; regularized methods; convex cost functions; penalized model selection; empirical processes

Date: 2003-03-06

Preprint number: PMA-801


Postscript file: PMA-801.ps