ResearchTrend.AI
arXiv:2310.11876

SQ Lower Bounds for Learning Mixtures of Linear Classifiers

18 October 2023
Ilias Diakonikolas
D. Kane
Yuxin Sun
Abstract

We study the problem of learning mixtures of linear classifiers under Gaussian covariates. Given sample access to a mixture of $r$ distributions on $\mathbb{R}^n$ of the form $(\mathbf{x}, y_\ell)$, $\ell \in [r]$, where $\mathbf{x} \sim \mathcal{N}(0, \mathbf{I}_n)$ and $y_\ell = \mathrm{sign}(\langle \mathbf{v}_\ell, \mathbf{x} \rangle)$ for an unknown unit vector $\mathbf{v}_\ell$, the goal is to learn the underlying distribution in total variation distance. Our main result is a Statistical Query (SQ) lower bound suggesting that known algorithms for this problem are essentially best possible, even for the special case of uniform mixtures. In particular, we show that the complexity of any SQ algorithm for the problem is $n^{\mathrm{poly}(1/\Delta)\log(r)}$, where $\Delta$ is a lower bound on the pairwise $\ell_2$-separation between the $\mathbf{v}_\ell$'s. The key technical ingredient underlying our result is a new construction of spherical designs that may be of independent interest.
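To make the data model concrete, the following is a minimal sketch of sampling from the uniform mixture described above. The function name and the example unit vectors are illustrative choices, not from the paper; the paper treats the $\mathbf{v}_\ell$'s as unknown quantities to be learned.

```python
import numpy as np

def sample_mixture(vs, m, rng=None):
    """Draw m samples from a uniform mixture of linear classifiers.

    vs: (r, n) array of unit vectors v_ell (illustrative inputs; in the
    learning problem these are unknown). Each sample: pick ell uniformly
    from [r], draw x ~ N(0, I_n), and label y = sign(<v_ell, x>).
    """
    rng = np.random.default_rng(rng)
    r, n = vs.shape
    ells = rng.integers(0, r, size=m)       # uniform mixture component
    xs = rng.standard_normal((m, n))        # Gaussian covariates x ~ N(0, I_n)
    ys = np.sign(np.einsum("ij,ij->i", vs[ells], xs))  # row-wise <v_ell, x>
    return xs, ys

# Example: r = 2 orthogonal unit vectors in R^5 (so pairwise separation
# Delta = sqrt(2)); 1000 labeled samples from the mixture.
vs = np.stack([np.eye(5)[0], np.eye(5)[1]])
xs, ys = sample_mixture(vs, 1000, rng=0)
```

With continuous Gaussian covariates, $\langle \mathbf{v}_\ell, \mathbf{x} \rangle = 0$ occurs with probability zero, so the labels are $\pm 1$ almost surely.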
