
arXiv:2206.04589
Optimal SQ Lower Bounds for Robustly Learning Discrete Product Distributions and Ising Models

9 June 2022
Ilias Diakonikolas
D. Kane
Yuxin Sun
Abstract

We establish optimal Statistical Query (SQ) lower bounds for robustly learning certain families of discrete high-dimensional distributions. In particular, we show that no efficient SQ algorithm with access to an $\epsilon$-corrupted binary product distribution can learn its mean within $\ell_2$-error $o(\epsilon \sqrt{\log(1/\epsilon)})$. Similarly, we show that no efficient SQ algorithm with access to an $\epsilon$-corrupted ferromagnetic high-temperature Ising model can learn the model to total variation distance $o(\epsilon \log(1/\epsilon))$. Our SQ lower bounds match the error guarantees of known algorithms for these problems, providing evidence that current upper bounds for these tasks are best possible. At the technical level, we develop a generic SQ lower bound for discrete high-dimensional distributions starting from low-dimensional moment-matching constructions that we believe will find other applications. Additionally, we introduce new ideas to analyze these moment-matching constructions for discrete univariate distributions.
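To make the setting concrete, here is a minimal sketch (not from the paper) of the SQ access model for an $\epsilon$-corrupted binary product distribution: the learner never sees raw samples, only approximate expectations of bounded queries. All names, the corruption mixture, and the noise model below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Hypothetical sketch: an SQ oracle over an eps-corrupted binary product
# distribution. The oracle answers E[q(x)] up to tolerance tau; corruption
# is modeled as a mixture with an adversarial product component.
rng = np.random.default_rng(0)

d = 5                                  # dimension (illustrative)
mu = np.full(d, 0.3)                   # true mean vector of the product distribution
eps = 0.1                              # corruption fraction
mu_bad = np.full(d, 0.9)               # adversarial component's mean (assumed)

def sq_oracle(q, tau, n=200_000):
    """Answer E[q(x)] over the eps-corrupted mixture, within tolerance tau."""
    good = (rng.random((n, d)) < mu).astype(float)
    bad = (rng.random((n, d)) < mu_bad).astype(float)
    pick_bad = rng.random(n) < eps
    x = np.where(pick_bad[:, None], bad, good)
    # Oracle noise modeled here as uniform in [-tau, tau]; in the SQ model
    # it may be adversarial within that tolerance.
    return q(x).mean() + rng.uniform(-tau, tau)

# Query each coordinate mean. The corruption shifts every answer by roughly
# eps * (mu_bad - mu) = 0.06, so even near-exact SQ answers leave an
# eps-order ambiguity in the estimated mean.
est = np.array([sq_oracle(lambda x, i=i: x[:, i], tau=0.001) for i in range(d)])
print(np.round(est, 3))
```

This only illustrates why corruption limits what coordinate-wise SQ queries can recover; the paper's lower bound argument is via moment-matching constructions, not this mixture.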
