ResearchTrend.AI

arXiv:2306.00732
Sharper Bounds for $\ell_p$ Sensitivity Sampling

1 June 2023
David P. Woodruff
T. Yasuda
Abstract

In large-scale machine learning, random sampling is a popular way to approximate datasets by a small representative subset of examples. In particular, sensitivity sampling is an intensely studied technique which provides provable guarantees on the quality of approximation, while reducing the number of examples to the product of the VC dimension $d$ and the total sensitivity $\mathfrak S$ in remarkably general settings. However, guarantees going beyond this general bound of $\mathfrak S d$ are known in perhaps only one setting, for $\ell_2$ subspace embeddings, despite intense study of sensitivity sampling in prior work. In this work, we show the first bounds for sensitivity sampling for $\ell_p$ subspace embeddings for $p > 2$ that improve over the general $\mathfrak S d$ bound, achieving a bound of roughly $\mathfrak S^{2-2/p}$ for $2 < p < \infty$. Furthermore, our techniques yield further new results in the study of sampling algorithms, showing that the root leverage score sampling algorithm achieves a bound of roughly $d$ for $1 \leq p < 2$, and that a combination of leverage score and sensitivity sampling achieves an improved bound of roughly $d^{2/p} \mathfrak S^{2-4/p}$ for $2 < p < \infty$. Our sensitivity sampling results yield the best known sample complexity for a wide class of structured matrices that have small $\ell_p$ sensitivity.
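The abstract contrasts sensitivity sampling with $\ell_2$ leverage score sampling, the one prior setting where a better-than-$\mathfrak S d$ bound was known. As background, the following is a minimal NumPy sketch of plain $\ell_2$ leverage score row sampling for a subspace embedding; it is illustrative only, not the paper's $\ell_p$ algorithm, and the function names are ours:

```python
import numpy as np

def leverage_scores(A):
    # Leverage scores are the squared row norms of an orthonormal
    # basis Q for the column space of A; they sum to rank(A).
    Q, _ = np.linalg.qr(A)
    return np.sum(Q**2, axis=1)

def leverage_score_sample(A, m, rng):
    # Sample m rows i.i.d. with probability proportional to the
    # leverage scores, rescaling each chosen row by 1/sqrt(m * p_i)
    # so that E[SA^T SA] = A^T A.
    tau = leverage_scores(A)
    p = tau / tau.sum()
    idx = rng.choice(A.shape[0], size=m, p=p)
    return A[idx] / np.sqrt(m * p[idx, None])

rng = np.random.default_rng(0)
A = rng.standard_normal((10000, 5))   # tall matrix, n >> d
SA = leverage_score_sample(A, 500, rng)
# SA is a 500-row sketch; with high probability, ||SAx||_2
# approximates ||Ax||_2 for every x simultaneously.
```

For $\ell_2$, roughly $d \log d$ sampled rows suffice for a constant-factor subspace embedding; the paper's contribution is analogous improvements for $\ell_p$ with $p > 2$, where the sampling probabilities come from $\ell_p$ sensitivities rather than leverage scores.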
