ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Unregularized Online Learning Algorithms with General Loss Functions
Yiming Ying, Ding-Xuan Zhou
2 March 2015 · arXiv:1503.00623

Papers citing "Unregularized Online Learning Algorithms with General Loss Functions"

8 / 8 papers shown
  1. Differentially Private Stochastic Gradient Descent with Low-Noise (FedML)
     Puyu Wang, Yunwen Lei, Yiming Ying, Ding-Xuan Zhou · 09 Sep 2022
  2. Sharper Utility Bounds for Differentially Private Models (FedML)
     Yilin Kang, Yong Liu, Jian Li, Weiping Wang · 22 Apr 2022
  3. Stability and Generalization for Randomized Coordinate Descent
     Puyu Wang, Liang Wu, Yunwen Lei · 17 Aug 2021
  4. Improved Learning Rates for Stochastic Optimization: Two Theoretical Viewpoints
     Shaojie Li, Yong Liu · 19 Jul 2021
  5. Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization
     Baojian Zhou, F. Chen, Yiming Ying · 09 May 2019
  6. Convergence of Online Mirror Descent
     Yunwen Lei, Ding-Xuan Zhou · 18 Feb 2018
  7. Convergence of Unregularized Online Learning Algorithms
     Yunwen Lei, Lei Shi, Zheng-Chu Guo · 09 Aug 2017
  8. Learning Theory Approach to Minimum Error Entropy Criterion
     Ting Hu, Jun Fan, Qiang Wu, Ding-Xuan Zhou · 03 Aug 2012