
arXiv:2404.12376
Matching the Statistical Query Lower Bound for k-sparse Parity Problems with Stochastic Gradient Descent

18 April 2024
Yiwen Kou
Zixiang Chen
Quanquan Gu
Sham Kakade
Abstract

The k-parity problem is a classical problem in computational complexity and algorithmic theory, serving as a key benchmark for understanding computational classes. In this paper, we solve the k-parity problem with stochastic gradient descent (SGD) on two-layer fully-connected neural networks. We demonstrate that SGD can efficiently solve the k-sparse parity problem on a d-dimensional hypercube (k ≤ O(√d)) with a sample complexity of Õ(d^{k−1}) using 2^{Θ(k)} neurons, thus matching the established Ω(d^k) lower bounds of Statistical Query (SQ) models. Our theoretical analysis begins by constructing a good neural network capable of correctly solving the k-parity problem. We then demonstrate how a neural network trained with SGD can effectively approximate this good network, solving the k-parity problem with small statistical errors. Our theoretical results and findings are supported by empirical evidence, showcasing the efficiency and efficacy of our approach.
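The setup described in the abstract can be sketched in a few lines: inputs are drawn uniformly from the hypercube {−1, +1}^d, the label is the product of k hidden coordinates, and the learner is a two-layer fully-connected network. The sketch below is illustrative only; the dimensions, initialization scale, and activation are assumptions, not the paper's exact construction.

```python
import numpy as np

# Illustrative sketch of the k-sparse parity setup (d, k, n chosen
# arbitrarily here; the paper's regime is k <= O(sqrt(d))).
rng = np.random.default_rng(0)
d, k, n = 10, 3, 8
support = rng.choice(d, size=k, replace=False)  # hidden relevant coordinates

# Inputs uniform over the d-dimensional hypercube {-1, +1}^d.
X = rng.choice([-1.0, 1.0], size=(n, d))

# Label = parity of the k hidden coordinates, i.e. their product.
y = np.prod(X[:, support], axis=1)

# A two-layer fully-connected network of the kind trained with SGD in
# the paper; width 2^k follows the abstract's 2^{Theta(k)} neuron count,
# but the ReLU activation and initialization scale are assumptions.
m = 2 ** k
W = rng.normal(scale=1.0 / np.sqrt(d), size=(m, d))
a = rng.choice([-1.0, 1.0], size=m)

def forward(x):
    """Network output f(x) = sum_j a_j * relu(w_j . x)."""
    return a @ np.maximum(W @ x, 0.0)
```

Each label is ±1 and depends only on the k support coordinates, which is what makes the problem hard for SQ learners: flipping any off-support coordinate leaves the label unchanged.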
