Université Paris 6
Pierre et Marie Curie
Université Paris 7
Denis Diderot

CNRS U.M.R. 7599
"Probabilités et Modèles Aléatoires"

"Universal" aggregation rules with exact bias bounds

Author(s):

MSC Classification Code(s):

Abstract: We present in this paper a "progressive mixture rule" for aggregating a countable family of "primary" estimators of the sample distribution of an exchangeable statistical experiment, based on an idea first introduced by Andrew Barron to aggregate fixed distributions. When the mean risk is measured with the Kullback-Leibler divergence, this rule satisfies an exact bias bound and, at the same time, a complexity bound of optimal order (provided there are not too many primary estimators with low variance). It is "universal" in the sense that it requires no restrictive assumptions on the true sample distribution (in other words, the bias term is not assumed to be small). We give applications to adaptive histograms, using an effective aggregation algorithm derived from the work of Willems, Shtarkov, and Tjalkens on universal data compression. We also discuss least squares regression, taking the well-known example of adaptive regression in Besov spaces. To handle the regression case, we use a method of proof borrowed from our paper on Gibbs estimators. We comment on the difference between our aggregation rule and selection rules (where a single primary estimator is selected), and show that it is in general impossible to obtain an exact bias bound with a selection rule. We close the paper with a description of an approximate Monte-Carlo computation of the progressive mixture rule by a simulated-annealing-type algorithm.
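
As a rough illustration (in our own notation, not the paper's), consider Barron's fixed-distribution case mentioned above: candidate densities $p_\lambda$ indexed by a countable set $\Lambda$, prior weights $\pi_\lambda$ with $\sum_{\lambda \in \Lambda} \pi_\lambda = 1$, and an i.i.d. sample $X_1, \dots, X_N$ drawn from $P$. The progressive mixture first forms the posterior predictive densities
$$\hat q_k(x) \;=\; \frac{\sum_{\lambda \in \Lambda} \pi_\lambda \, p_\lambda(x) \prod_{i=1}^{k} p_\lambda(X_i)}{\sum_{\lambda \in \Lambda} \pi_\lambda \prod_{i=1}^{k} p_\lambda(X_i)}, \qquad k = 0, \dots, N,$$
and then takes their Cesàro average $\tilde p_N = \frac{1}{N+1} \sum_{k=0}^{N} \hat q_k$. Combining the chain rule for the Kullback-Leibler divergence $\mathcal{K}$ with its convexity in the second argument yields a bound of the form
$$\mathbb{E} \, \mathcal{K}(P, \tilde p_N) \;\le\; \inf_{\lambda \in \Lambda} \Big\{ \mathcal{K}(P, p_\lambda) + \frac{\log \pi_\lambda^{-1}}{N+1} \Big\},$$
which is "exact" in the sense that the bias term $\mathcal{K}(P, p_\lambda)$ enters with leading constant one.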

Keywords: Aggregation of Estimators; Adaptive Density Estimation; Adaptive Histograms; Adaptive Least Squares Regression; Context-Tree Weighting Method; Mean Kullback Risk; Besov Spaces; Monte-Carlo Simulation of Posterior Distributions

Date: 1999-06-21

Preprint number: PMA-510