Classification via local multi-resolution projections

We focus on the supervised binary classification problem, which consists in guessing the label $Y$ associated to a co-variate $X$, given a set of $n$ independent and identically distributed co-variates and associated labels $(X_i, Y_i)$, $i = 1, \ldots, n$. We assume that the law of the random vector $(X, Y)$ is unknown and that the marginal law of $X$ admits a density supported on a set $A$. In the particular case of plug-in classifiers, solving the classification problem boils down to estimating the regression function $\eta(x) = \mathbb{E}[Y \mid X = x]$. Assuming first the support $A$ to be known, we show how to construct an estimator of $\eta$ by localized projections onto a multi-resolution analysis (MRA). In a second step, we show how this estimation procedure generalizes to the case where $A$ is unknown. Interestingly, this novel estimation procedure achieves theoretical performance similar to that of the celebrated local-polynomial estimator (LPE). In addition, it benefits from the lattice structure of the underlying MRA and therefore outperforms the LPE from a computational standpoint, which turns out to be a crucial feature in many practical applications. Finally, we prove that the associated plug-in classifier can reach super-fast rates under a margin assumption.
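
To make the plug-in idea concrete, the following is a minimal sketch, not the estimator analyzed in the paper: it estimates the regression function by projecting onto the simplest MRA (the Haar scaling space at a fixed resolution level $j$, i.e. averaging labels over dyadic cells of side $2^{-j}$) and then classifies by thresholding the estimate at $1/2$. The function names, the Haar basis, and the restriction to $[0,1]^d$ are illustrative assumptions.

```python
# Hypothetical sketch: a plug-in classifier whose regression-function estimate
# is a projection onto the Haar MRA at resolution level j (piecewise constants
# on dyadic cells). Illustration only; not the estimator defined in the paper.
import numpy as np

def haar_projection_regression(X, Y, j, d):
    """Estimate eta(x) = E[Y | X = x] on [0, 1]^d by averaging labels within
    each dyadic cell of side 2^-j (projection onto the Haar scaling space V_j)."""
    n_cells = 2 ** j
    cells = np.clip((X * n_cells).astype(int), 0, n_cells - 1)    # (n, d) cell indices
    flat = np.ravel_multi_index(cells.T, (n_cells,) * d)          # one cell id per point
    sums = np.bincount(flat, weights=Y, minlength=n_cells ** d)
    counts = np.bincount(flat, minlength=n_cells ** d)
    # Cells with no data default to 0.5 (an arbitrary tie-breaking choice).
    means = np.divide(sums, counts, out=np.full(n_cells ** d, 0.5), where=counts > 0)
    return means, n_cells

def plug_in_classify(X_new, means, n_cells, d):
    """Plug-in rule: predict 1 where the estimated regression function is >= 1/2."""
    cells = np.clip((X_new * n_cells).astype(int), 0, n_cells - 1)
    flat = np.ravel_multi_index(cells.T, (n_cells,) * d)
    return (means[flat] >= 0.5).astype(int)

# Toy usage on [0, 1]^2 with labels driven by the first coordinate.
rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 2))
Y = (rng.uniform(size=2000) < X[:, 0]).astype(float)
means, n_cells = haar_projection_regression(X, Y, j=3, d=2)
print(plug_in_classify(np.array([[0.9, 0.5], [0.1, 0.5]]), means, n_cells, d=2))
```

The lattice structure mentioned above is visible even in this toy version: each observation is binned once, and refining or coarsening the resolution only reorganizes dyadic cells, whereas a local-polynomial estimator must solve a separate weighted least-squares problem around every query point.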