ResearchTrend.AI


arXiv:1408.2714v2 (latest)
Learning From Non-iid Data: Fast Rates for the One-vs-All Multiclass Plug-in Classifiers

12 August 2014
Vu C. Dinh
L. Ho
Cuong V. Nguyen
D. M. Nguyen
Binh T. Nguyen
Abstract

We prove new fast learning rates for the one-vs-all multiclass plug-in classifiers trained either from exponentially strongly mixing data or from data generated by a converging drifting distribution. These are two typical scenarios where training data are not iid. The learning rates are obtained under a multiclass version of Tsybakov's margin assumption, a type of low-noise assumption, and do not depend on the number of classes. Our results are general and include a previous result for binary-class plug-in classifiers with iid data as a special case. In contrast to previous works for least squares SVMs under the binary-class setting, our results retain the optimal learning rate in the iid case.
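To make the central object concrete, here is a minimal sketch of a one-vs-all plug-in classifier: for each class k, estimate the regression function eta_k(x) = P(Y = k | X = x), then predict the class with the largest estimate. This is an illustrative construction, not the paper's exact estimator — the nearest-neighbor averaging, the toy data, and all function names below are assumptions for the sketch.

```python
import math
import random

def knn_eta(train, x, k_class, n_neighbors=5):
    """Estimate P(Y = k_class | X = x) by averaging the binary
    one-vs-all labels of the n_neighbors nearest training points.
    (Hypothetical estimator chosen for illustration; the paper's
    results apply to general plug-in regression estimates.)"""
    by_dist = sorted(train, key=lambda p: math.dist(p[0], x))
    nearest = by_dist[:n_neighbors]
    return sum(1 for _, label in nearest if label == k_class) / n_neighbors

def plug_in_predict(train, x, classes, n_neighbors=5):
    """One-vs-all plug-in rule: argmax over the per-class
    conditional-probability estimates."""
    return max(classes, key=lambda c: knn_eta(train, x, c, n_neighbors))

# Toy data: three well-separated 2-D clusters, one per class.
random.seed(0)
train = [((m + random.gauss(0, 0.1), m + random.gauss(0, 0.1)), cls)
         for cls, m in enumerate([0.0, 1.0, 2.0]) for _ in range(30)]
print(plug_in_predict(train, (1.0, 1.0), classes=[0, 1, 2]))  # → 1
```

Note that the decision rule itself does not depend on the number of classes beyond the argmax, which is in line with the abstract's point that the learning rates obtained are class-count independent.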
