
Scalable Dual Coordinate Descent for Kernel Methods

Zishan Shao, Aditya Devarakonda
26 June 2024 · arXiv:2406.18001

Papers citing "Scalable Dual Coordinate Descent for Kernel Methods"

1 / 1 papers shown

Communication-Efficient, 2D Parallel Stochastic Gradient Descent for Distributed-Memory Optimization
Aditya Devarakonda, Ramakrishnan Kannan
FedML · 13 Jan 2025