Optimal Convergence for Distributed Learning with Stochastic Gradient Methods and Spectral Algorithms

22 January 2018
Junhong Lin, V. Cevher

Papers citing "Optimal Convergence for Distributed Learning with Stochastic Gradient Methods and Spectral Algorithms"

7 / 7 papers shown
  1. On the Impacts of the Random Initialization in the Neural Tangent Kernel Theory
     Guhan Chen, Yicheng Li, Qian Lin (AAML), 08 Oct 2024
  2. Optimal Kernel Quantile Learning with Random Features
     Caixing Wang, Xingdong Feng, 24 Aug 2024
  3. Spectral Algorithms on Manifolds through Diffusion
     Weichun Xia, Lei Shi, 06 Mar 2024
  4. On the Optimality of Misspecified Spectral Algorithms
     Hao Zhang, Yicheng Li, Qian Lin, 27 Mar 2023
  5. Statistical Optimality of Divide and Conquer Kernel-based Functional Linear Regression
     Jiading Liu, Lei Shi, 20 Nov 2022
  6. Sobolev Norm Learning Rates for Conditional Mean Embeddings
     Prem M. Talwai, A. Shameli, D. Simchi-Levi, 16 May 2021
  7. Sobolev Norm Learning Rates for Regularized Least-Squares Algorithm
     Simon Fischer, Ingo Steinwart, 23 Feb 2017