
Classification via local multi-resolution projections

Abstract

We focus on the supervised binary classification problem, which consists of predicting the label $Y$ associated with a covariate $X \in \mathbb{R}^d$, given a set of $n$ independent and identically distributed covariates and associated labels $(X_i, Y_i)$. We assume that the law of the random vector $(X, Y)$ is unknown and that the marginal law of $X$ admits a density supported on a set $\mathcal{A}$. In the particular case of plug-in classifiers, solving the classification problem boils down to estimating the regression function $\eta(X) = \mathbb{E}[Y \mid X]$. Assuming first that $\mathcal{A}$ is known, we show how to construct an estimator of $\eta$ by localized projections onto a multi-resolution analysis (MRA). In a second step, we show how this estimation procedure generalizes to the case where $\mathcal{A}$ is unknown. Interestingly, this novel estimation procedure achieves theoretical performance similar to that of the celebrated local polynomial estimator (LPE). In addition, it benefits from the lattice structure of the underlying MRA and thus outperforms the LPE from a computational standpoint, which turns out to be a crucial feature in many practical applications. Finally, we prove that the associated plug-in classifier can reach super-fast rates under a margin assumption.
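To make the plug-in idea concrete, here is a minimal one-dimensional sketch, not the paper's actual procedure: it estimates $\eta$ by projecting the labels onto the coarsest MRA available, the Haar basis at resolution $j$ (i.e., averaging $Y$ over dyadic cells of width $2^{-j}$), then classifies by thresholding the estimate at $1/2$. The synthetic data, the choice of $\eta$, and the resolution level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data on the (assumed known) support A = [0, 1];
# eta(x) = P(Y = 1 | X = x) is chosen for illustration only.
n = 2000
X = rng.uniform(0.0, 1.0, size=n)
eta = lambda x: 0.5 + 0.4 * np.sin(2 * np.pi * x)
Y = (rng.uniform(size=n) < eta(X)).astype(int)

def haar_projection_estimate(X, Y, j):
    """Estimate eta by projection onto the Haar MRA at resolution j:
    average Y over each dyadic cell [k 2^-j, (k+1) 2^-j)."""
    m = 2 ** j
    cells = np.minimum((X * m).astype(int), m - 1)  # dyadic cell index of each X_i
    sums = np.bincount(cells, weights=Y, minlength=m)
    counts = np.bincount(cells, minlength=m)
    # Empty cells get the uninformative value 1/2.
    return np.where(counts > 0, sums / np.maximum(counts, 1), 0.5)

def plug_in_classify(x, est, j):
    """Plug-in rule: predict 1 wherever the estimated eta is >= 1/2."""
    m = 2 ** j
    cells = np.minimum((np.asarray(x) * m).astype(int), m - 1)
    return (est[cells] >= 0.5).astype(int)

j = 4
est = haar_projection_estimate(X, Y, j)
x_test = rng.uniform(0.0, 1.0, size=1000)
pred = plug_in_classify(x_test, est, j)
bayes = (eta(x_test) >= 0.5).astype(int)  # Bayes classifier, known here by construction
print("agreement with Bayes classifier:", np.mean(pred == bayes))
```

The dyadic lattice is what makes the approach cheap: locating the cell of a query point is a single integer operation, whereas a local polynomial estimator must solve a weighted least-squares problem per query.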
