A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization

9 March 2018
Andre Milzarek
X. Xiao
Shicong Cen
Zaiwen Wen
M. Ulbrich

Papers citing "A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization"

2 papers shown

Explicit Second-Order Min-Max Optimization Methods with Optimal Convergence Guarantee
Tianyi Lin
P. Mertikopoulos
Michael I. Jordan
23 Oct 2022

A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao
Tong Zhang
ODL
19 Mar 2014