Learning From Non-iid Data: Fast Rates for the One-vs-All Multiclass Plug-in Classifiers

Abstract

We prove fast learning rates for one-vs-all multiclass plug-in classifiers trained passively either on exponentially strongly mixing data or on data generated by a converging drifting distribution. These are two typical scenarios in which the training data are not iid. As in previous work, we assume that the test data distribution satisfies a multiclass version of Tsybakov's margin assumption, a type of low-noise condition. Our results are general and include a previous result for binary plug-in classifiers with iid data as a special case.
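To make the object of study concrete, here is a minimal sketch of a one-vs-all plug-in classifier: for each class c, estimate the regression function eta_c(x) = P(Y = c | X = x) from the binary indicator {y == c}, then predict the class whose estimate is largest. The k-NN estimator and the function name `knn_plugin_ova` are illustrative choices, not part of the paper; the paper's results concern the statistical rates of such plug-in rules, not any particular estimator.

```python
import numpy as np

def knn_plugin_ova(X_train, y_train, x, k, num_classes):
    """One-vs-all plug-in prediction at a single point x.

    For each class c, the one-vs-all regression function
    eta_c(x) = P(Y = c | X = x) is estimated by k-NN averaging of the
    binary labels 1{y == c}; the plug-in rule predicts argmax_c eta_c(x).
    """
    # Distances from x to every training point, then the k nearest indices.
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    # eta_hat[c]: fraction of the k nearest neighbours with label c,
    # i.e. the k-NN estimate of the class-c one-vs-all regression function.
    eta_hat = np.array([(y_train[nearest] == c).mean()
                        for c in range(num_classes)])
    return int(np.argmax(eta_hat))

# Toy usage: three well-separated 1-D clusters, one per class.
X = np.array([[0.0], [0.1], [10.0], [10.1], [20.0], [20.1]])
y = np.array([0, 0, 1, 1, 2, 2])
pred = knn_plugin_ova(X, y, np.array([9.9]), k=2, num_classes=3)
```

The plug-in structure is what matters here: classification is reduced to estimating the conditional class probabilities, and the margin (low-noise) assumption controls how estimation error in eta_c translates into excess classification risk.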
