
Fundamental Limits of Learning High-dimensional Simplices in Noisy Regimes

Main: 11 pages
Bibliography: 4 pages
Appendix: 29 pages
Abstract

In this paper, we establish sample complexity bounds for learning high-dimensional simplices in $\mathbb{R}^K$ from noisy data. Specifically, we consider $n$ i.i.d. samples drawn uniformly from an unknown simplex in $\mathbb{R}^K$, each corrupted by additive Gaussian noise of unknown variance. We prove that there exists an algorithm which, with high probability, outputs a simplex within $\ell_2$ or total variation (TV) distance at most $\varepsilon$ from the true simplex, provided $n \ge (K^2/\varepsilon^2)\, e^{\mathcal{O}(K/\mathrm{SNR}^2)}$, where $\mathrm{SNR}$ is the signal-to-noise ratio. Extending our prior work (Saberi et al., 2023), we derive new information-theoretic lower bounds showing that simplex estimation within TV distance $\varepsilon$ requires at least $n \ge \Omega(K^3 \sigma^2/\varepsilon^2 + K/\varepsilon)$ samples, where $\sigma^2$ denotes the noise variance. In the noiseless scenario, our lower bound $n \ge \Omega(K/\varepsilon)$ matches known upper bounds up to constant factors. We resolve an open question by demonstrating that when $\mathrm{SNR} \ge \Omega(K^{1/2})$, the sample complexity of the noisy case matches that of the noiseless case. Our analysis leverages sample compression techniques (Ashtiani et al., 2018) and introduces a novel Fourier-based method for recovering distributions from noisy observations, which is potentially applicable beyond simplex learning.
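As a rough illustration of how the stated bounds scale, the following minimal Python sketch evaluates the upper bound $(K^2/\varepsilon^2)\, e^{\mathcal{O}(K/\mathrm{SNR}^2)}$ and the lower bound $\Omega(K^3 \sigma^2/\varepsilon^2 + K/\varepsilon)$ for given parameters. The constants c, c1, c2 are hypothetical placeholders, since the abstract only specifies asymptotic orders.

import math

def upper_bound_samples(K: int, eps: float, snr: float, c: float = 1.0) -> float:
    """Sufficient sample size per the paper's upper bound,
    n >= (K^2 / eps^2) * exp(O(K / SNR^2)).
    The constant `c` in the exponent is a hypothetical placeholder
    for the O(.) factor."""
    return (K**2 / eps**2) * math.exp(c * K / snr**2)

def lower_bound_samples(K: int, eps: float, sigma2: float,
                        c1: float = 1.0, c2: float = 1.0) -> float:
    """Necessary sample size per the information-theoretic lower bound,
    n >= Omega(K^3 * sigma^2 / eps^2 + K / eps).
    Constants `c1`, `c2` are hypothetical placeholders for the
    Omega(.) factors."""
    return c1 * K**3 * sigma2 / eps**2 + c2 * K / eps

# Example: at the high-SNR threshold SNR = sqrt(K), the exponential factor
# exp(c * K / SNR^2) is O(1), so the upper bound behaves like the
# noiseless-case rate, consistent with the paper's claim.
K, eps = 50, 0.1
snr = math.sqrt(K)
print(f"upper bound: {upper_bound_samples(K, eps, snr):.3e} samples")
print(f"lower bound: {lower_bound_samples(K, eps, sigma2=1.0):.3e} samples")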

@article{saberi2025_2506.10101,
  title={Fundamental Limits of Learning High-dimensional Simplices in Noisy Regimes},
  author={Seyed Amir Hossein Saberi and Amir Najafi and Abolfazl Motahari and Babak H. Khalaj},
  journal={arXiv preprint arXiv:2506.10101},
  year={2025}
}