The Space Complexity of Approximating Logistic Loss

3 December 2024
Gregory Dexter
P. Drineas
Rajiv Khanna
Abstract

We provide space complexity lower bounds for data structures that approximate logistic loss up to $\epsilon$-relative error on a logistic regression problem with data $\mathbf{X} \in \mathbb{R}^{n \times d}$ and labels $\mathbf{y} \in \{-1,1\}^d$. The space complexity of existing coreset constructions depends on a natural complexity measure $\mu_\mathbf{y}(\mathbf{X})$, first defined in (Munteanu, 2018). We give an $\tilde{\Omega}(\frac{d}{\epsilon^2})$ space complexity lower bound in the regime $\mu_\mathbf{y}(\mathbf{X}) = O(1)$ that shows existing coresets are optimal in this regime up to lower-order factors. We also prove a general $\tilde{\Omega}(d \cdot \mu_\mathbf{y}(\mathbf{X}))$ space lower bound when $\epsilon$ is constant, showing that the dependency on $\mu_\mathbf{y}(\mathbf{X})$ is not an artifact of mergeable coresets. Finally, we refute a prior conjecture that $\mu_\mathbf{y}(\mathbf{X})$ is hard to compute by providing an efficient linear programming formulation, and we empirically compare our algorithm to prior approximate methods.
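
To make the quantities in the abstract concrete, here is a minimal Python/NumPy sketch that evaluates the logistic loss and a crude Monte-Carlo lower bound on $\mu_\mathbf{y}(\mathbf{X})$. It assumes the standard definition $\mu_\mathbf{y}(\mathbf{X}) = \sup_{\beta \neq 0} \|(\mathbf{D_y X}\beta)^+\|_1 / \|(\mathbf{D_y X}\beta)^-\|_1$ from Munteanu et al. (2018), which the abstract references but does not restate; it does not reproduce the paper's exact linear programming formulation, and all function names and parameters below are illustrative only.

```python
import numpy as np

def logistic_loss(X, y, beta):
    """Logistic loss L(beta) = sum_i log(1 + exp(-y_i <x_i, beta>))."""
    margins = y * (X @ beta)                  # y_i <x_i, beta>
    return np.sum(np.logaddexp(0.0, -margins))

def mu_lower_bound(X, y, n_trials=5000, seed=0):
    """Monte-Carlo lower bound on mu_y(X), assumed to be
    sup_{beta != 0} ||(D_y X beta)^+||_1 / ||(D_y X beta)^-||_1.

    Random directions only give a lower bound on the supremum;
    the paper computes mu_y(X) exactly via a linear program,
    which is not reproduced here.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Z = y[:, None] * X                        # rows y_i * x_i
    best = 0.0
    for _ in range(n_trials):
        beta = rng.standard_normal(d)
        v = Z @ beta
        pos = np.sum(v[v > 0])
        neg = -np.sum(v[v < 0])
        if neg > 0:
            best = max(best, pos / neg)
    return best

# Toy usage on synthetic data: n = 200 points in d = 5 dimensions.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
y = rng.choice([-1.0, 1.0], size=200)
beta = rng.standard_normal(5)
print("logistic loss:", logistic_loss(X, y, beta))
print("Monte-Carlo lower bound on mu_y(X):", mu_lower_bound(X, y))
```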
