Faster Subgradient Methods for Functions with Hölderian Growth
Patrick R. Johnstone, P. Moulin
1 April 2017

Papers citing "Faster Subgradient Methods for Functions with Hölderian Growth"

6 of 6 papers shown
Some Primal-Dual Theory for Subgradient Methods for Strongly Convex Optimization
Benjamin Grimmer, Danlin Li
31 Dec 2024

Projective Splitting with Forward Steps: Asynchronous and Block-Iterative Operator Splitting
Patrick R. Johnstone, Jonathan Eckstein
19 Mar 2018

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark Schmidt
16 Aug 2016

A Unified Approach to Error Bounds for Structured Convex Optimization Problems
Zirui Zhou, Anthony Man-Cho So
11 Dec 2015

RSG: Beating Subgradient Method without Smoothness and Strong Convexity
Tianbao Yang, Qihang Lin
09 Dec 2015

Learning with Submodular Functions: A Convex Optimization Perspective
Francis R. Bach
28 Nov 2011