ResearchTrend.AI
A Stochastic Proximal Method for Nonsmooth Regularized Finite Sum Optimization

14 June 2022
Dounia Lakhmiri
D. Orban
Andrea Lodi
arXiv:2206.06531

Papers citing "A Stochastic Proximal Method for Nonsmooth Regularized Finite Sum Optimization"

2 papers shown:
Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste
31 Jan 2021
Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark Schmidt
16 Aug 2016