ResearchTrend.AI

© 2025 ResearchTrend.AI. All rights reserved.
Minimizing the Maximal Loss: How and Why?

4 February 2016
Shai Shalev-Shwartz, Y. Wexler

Papers citing "Minimizing the Maximal Loss: How and Why?"

5 papers shown
Nonsmooth Nonconvex-Nonconcave Minimax Optimization: Primal-Dual Balancing and Iteration Complexity Analysis
Jiajin Li, Lingling Zhu, Anthony Man-Cho So
17 Jan 2025
Uncertainty-Aware Robust Learning on Noisy Graphs
Shuyi Chen, Kaize Ding, Shixiang Zhu
14 Jun 2023
Adaptive Sampling for Stochastic Risk-Averse Learning
Sebastian Curi, Kfir Y. Levy, Stefanie Jegelka, Andreas Krause
28 Oct 2019
Even Faster Accelerated Coordinate Descent Using Non-Uniform Sampling
Zeyuan Allen-Zhu, Zheng Qu, Peter Richtárik, Yang Yuan
30 Dec 2015
Sublinear Optimization for Machine Learning
K. Clarkson, Elad Hazan, David P. Woodruff
21 Oct 2010