Efficient Distance Metric Learning by Adaptive Sampling and Mini-Batch Stochastic Gradient Descent (SGD)

3 April 2013
Qi Qian, Rong Jin, Jinfeng Yi, Lijun Zhang, Shenghuo Zhu
arXiv:1304.1192

Papers citing "Efficient Distance Metric Learning by Adaptive Sampling and Mini-Batch Stochastic Gradient Descent (SGD)"

3 / 3 papers shown

  1. Stochastic Optimization of Smooth Loss. Rong Jin. 30 Nov 2013.
  2. Projection-free Online Learning. Elad Hazan, Satyen Kale. 18 Jun 2012.
  3. Better Mini-Batch Algorithms via Accelerated Gradient Methods. Andrew Cotter, Ohad Shamir, Nathan Srebro, Karthik Sridharan. 22 Jun 2011.