The Strength of Nesterov's Extrapolation in the Individual Convergence of Nonsmooth Optimization

8 June 2020
Wei Tao, Zhisong Pan, Gao-wei Wu, Qing Tao

Papers citing "The Strength of Nesterov's Extrapolation in the Individual Convergence of Nonsmooth Optimization"

5 / 5 papers shown

Adapting Step-size: A Unified Perspective to Analyze and Improve Gradient-based Methods for Adversarial Attacks
Wei Tao, Lei Bao, Long Sheng, Gao-wei Wu, Qing Tao
27 Jan 2023

Minimax risk classifiers with 0-1 loss
Santiago Mazuelas, Mauricio Romero, Peter Grünwald
17 Jan 2022

Distributed stochastic proximal algorithm with random reshuffling for non-smooth finite-sum optimization
Xia Jiang, Xianlin Zeng, Jian Sun, Jie Chen, Lihua Xie
06 Nov 2021

A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
Simon Lacoste-Julien, Mark Schmidt, Francis R. Bach
10 Dec 2012

Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
Ohad Shamir, Tong Zhang
08 Dec 2012