Sparse classification boundaries

27 March 2009
Yu. I. Ingster
C. Pouet
Alexandre B. Tsybakov
arXiv:0903.4807
Abstract

Given a training sample of size m from a d-dimensional population, we wish to allocate a new observation Z ∈ ℝ^d to this population or to the noise. We suppose that the difference between the distribution of the population and that of the noise is only in a shift, which is a sparse vector. For Gaussian noise, a fixed sample size m, and dimension d tending to infinity, we obtain the sharp classification boundary and propose classifiers attaining it. We also give extensions of this result to the case where the sample size m depends on d and satisfies (log m)/log d → γ, 0 ≤ γ < 1, and to the case of non-Gaussian noise satisfying the Cramér condition.
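To make the setup concrete, here is a minimal simulation sketch in Python. It generates Gaussian data whose population mean differs from the noise by a sparse shift and applies a simple coordinate-thresholding rule to allocate a new observation. The thresholding heuristic and all parameter choices (d, m, the sparsity level k, the shift size a) are illustrative assumptions, not the sharp-boundary classifiers constructed in the paper.

# Illustrative sketch of the sparse-shift classification setup (assumptions only,
# not the paper's classifier): population mean differs from the noise mean by a
# sparse vector; decide whether a new observation Z comes from the population.
import numpy as np

rng = np.random.default_rng(0)
d, m = 10_000, 20      # dimension and training sample size (arbitrary demo values)
k, a = 50, 1.5         # number of shifted coordinates and shift magnitude (arbitrary)

theta = np.zeros(d)                                  # sparse shift vector
theta[rng.choice(d, size=k, replace=False)] = a

X = theta + rng.standard_normal((m, d))              # training sample from the population
Z = theta + rng.standard_normal(d)                   # new observation, here drawn from the population

# Estimate the shift by the training sample mean, keep coordinates exceeding a
# universal threshold (the sample mean has N(0, 1/m) noise per coordinate), then
# compare the projection of Z onto the estimated shift with the midpoint rule.
xbar = X.mean(axis=0)
support = np.abs(xbar) > np.sqrt(2 * np.log(d) / m)
stat = np.dot(Z[support], xbar[support])
decision = int(stat > 0.5 * np.sum(xbar[support] ** 2))   # 1: allocate Z to the population

print(f"estimated support size: {support.sum()}, decision: {decision}")

With these demo parameters the estimated support typically recovers most of the shifted coordinates and the new observation is allocated to the population; shrinking the shift size a or the sparsity k moves the problem toward the regime where, as the paper shows, no classifier can succeed.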
