ResearchTrend.AI
arXiv:2304.02599
Query lower bounds for log-concave sampling

5 April 2023
Sinho Chewi
Jaume de Dios Pont
Jerry Li
Chen Lu
Shyam Narayanan
Abstract

Log-concave sampling has witnessed remarkable algorithmic advances in recent years, but the corresponding problem of proving lower bounds for this task has remained elusive, with lower bounds previously known only in dimension one. In this work, we establish the following query lower bounds: (1) sampling from strongly log-concave and log-smooth distributions in dimension $d \ge 2$ requires $\Omega(\log \kappa)$ queries, which is sharp in any constant dimension, and (2) sampling from Gaussians in dimension $d$ (hence also from general log-concave and log-smooth distributions in dimension $d$) requires $\widetilde{\Omega}(\min(\sqrt{\kappa} \log d, d))$ queries, which is nearly sharp for the class of Gaussians. Here, $\kappa$ denotes the condition number of the target distribution. Our proofs rely upon (1) a multiscale construction inspired by work on the Kakeya conjecture in geometric measure theory, and (2) a novel reduction that demonstrates that block Krylov algorithms are optimal for this problem, as well as connections to lower bound techniques based on Wishart matrices developed in the matrix-vector query literature.
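To give a flavor of the Krylov algorithms the abstract refers to, the sketch below shows one standard way a Krylov method can sample from a Gaussian $N(0, A^{-1})$ with SPD precision matrix $A$: run Lanczos from a standard Gaussian vector $g$ and form $\|g\|\, Q\, T^{-1/2} e_1 \approx A^{-1/2} g$. This is a generic illustration of the technique, not the paper's construction; all function names here are illustrative, and the (block size 1) Lanczos variant is an assumption made for brevity.

```python
import numpy as np

def lanczos(A, b, k):
    """Run k steps of Lanczos on symmetric A from starting vector b.
    Returns an orthonormal basis Q (n x k) of the Krylov subspace and
    the tridiagonal matrix T = Q^T A Q (k x k)."""
    n = b.shape[0]
    Q = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k)
    q = b / np.linalg.norm(b)
    q_prev = np.zeros(n)
    for j in range(k):
        Q[:, j] = q
        w = A @ q                      # the only access to A: one matrix-vector query
        alpha[j] = q @ w
        w = w - alpha[j] * q - (beta[j - 1] * q_prev if j > 0 else 0)
        # full reorthogonalization for numerical stability
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        beta[j] = np.linalg.norm(w)
        if beta[j] < 1e-12:            # Krylov subspace became invariant
            k = j + 1
            break
        q_prev, q = q, w / beta[j]
    T = np.diag(alpha[:k]) + np.diag(beta[:k - 1], 1) + np.diag(beta[:k - 1], -1)
    return Q[:, :k], T

def gaussian_sample_krylov(A, k, rng):
    """Approximate sample from N(0, A^{-1}) using k matrix-vector queries:
    x = ||g|| * Q * T^{-1/2} * e_1, which approximates A^{-1/2} g."""
    n = A.shape[0]
    g = rng.standard_normal(n)         # g ~ N(0, I)
    Q, T = lanczos(A, g, k)
    evals, evecs = np.linalg.eigh(T)   # T is small (k x k), so this is cheap
    T_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    e1 = np.zeros(T.shape[0])
    e1[0] = 1.0
    return np.linalg.norm(g) * (Q @ (T_inv_sqrt @ e1))
```

With $k = d$ queries the output equals $A^{-1/2} g$ exactly (barring breakdown), so its covariance is exactly $A^{-1}$; the interesting regime, and the one the lower bound addresses, is how small $k$ can be as a function of the condition number $\kappa$ and the dimension $d$.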
