Statistical Query Lower Bounds for List-Decodable Linear Regression

17 June 2021
Ilias Diakonikolas, D. Kane, Ankit Pensia, Thanasis Pittas, Alistair Stewart
arXiv:2106.09689
Abstract

We study the problem of list-decodable linear regression, where an adversary can corrupt a majority of the examples. Specifically, we are given a set $T$ of labeled examples $(x, y) \in \mathbb{R}^d \times \mathbb{R}$ and a parameter $0 < \alpha < 1/2$ such that an $\alpha$-fraction of the points in $T$ are i.i.d. samples from a linear regression model with Gaussian covariates, and the remaining $(1-\alpha)$-fraction of the points are drawn from an arbitrary noise distribution. The goal is to output a small list of hypothesis vectors such that at least one of them is close to the target regression vector. Our main result is a Statistical Query (SQ) lower bound of $d^{\mathrm{poly}(1/\alpha)}$ for this problem. Our SQ lower bound qualitatively matches the performance of previously developed algorithms, providing evidence that current upper bounds for this task are nearly best possible.
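To make the setup concrete, here is a minimal, illustrative sketch (not from the paper) of the generative model the abstract describes: an $\alpha$-fraction of the points follow a Gaussian-covariate linear regression $y = \langle \beta^*, x \rangle + \text{noise}$, and the adversary controls the remaining $(1-\alpha)$-fraction. The variable names (`beta_star`, `beta_fake`) and the particular outlier construction are assumptions chosen for illustration; they show why a single least-squares fit can fail and why the output must be a short list of candidate vectors.

```python
# Illustrative sketch (not from the paper): data generated under the
# list-decodable linear regression model from the abstract.
import numpy as np

rng = np.random.default_rng(0)

d, n, alpha, sigma = 10, 1000, 0.1, 0.1   # dimension, sample size, inlier fraction, noise level
beta_star = rng.standard_normal(d)
beta_star /= np.linalg.norm(beta_star)     # target regression vector (assumed unit norm here)

n_in = int(alpha * n)                      # clean (inlier) samples
n_out = n - n_in                           # adversarially chosen samples

# Inliers: x ~ N(0, I_d), y = <beta_star, x> + N(0, sigma^2)
X_in = rng.standard_normal((n_in, d))
y_in = X_in @ beta_star + sigma * rng.standard_normal(n_in)

# Outliers: one simple adversarial choice is a second planted regression
# vector, which pulls any single global fit toward the wrong direction.
beta_fake = rng.standard_normal(d)
beta_fake /= np.linalg.norm(beta_fake)
X_out = rng.standard_normal((n_out, d))
y_out = X_out @ beta_fake + sigma * rng.standard_normal(n_out)

X = np.vstack([X_in, X_out])
y = np.concatenate([y_in, y_out])

# Ordinary least squares on the corrupted data does not recover beta_star:
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print("OLS error vs. beta_star :", np.linalg.norm(beta_ols - beta_star))
print("OLS error vs. beta_fake :", np.linalg.norm(beta_ols - beta_fake))
# A list-decoding algorithm would instead output a short list of candidates,
# at least one of which is required to be close to beta_star.
```

Since the adversary holds a $(1-\alpha)$-majority of the data, no single hypothesis can be guaranteed correct; allowing a small list (roughly $O(1/\alpha)$ candidates) is what makes recovery of something close to $\beta^*$ possible at all.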
