© 2025 ResearchTrend.AI, All rights reserved.

Revisiting Random Binning Features: Fast Convergence and Strong Parallelizability (arXiv:1809.05247)

14 September 2018
Lingfei Wu, Ian En-Hsu Yen, Jie Chen, Rui Yan
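For context on the paper's subject: random binning features (Rahimi and Recht, 2007) approximate a shift-invariant kernel by overlaying randomly shifted grids of random width on the input space, so that two points share a feature exactly when they land in the same cell. Below is a minimal illustrative sketch, not the paper's implementation, assuming the Laplacian kernel k(x, y) = exp(-||x - y||_1 / sigma), for which the bin widths are Gamma(2, sigma)-distributed:

```python
import numpy as np

def random_binning_features(X, sigma=1.0, R=2000, seed=0):
    """Random binning feature map for the Laplacian kernel
    k(x, y) = exp(-||x - y||_1 / sigma).
    Returns R repetitions of per-point bin identifiers."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    bins = []
    for _ in range(R):
        # Bin widths delta ~ Gamma(shape=2, scale=sigma), one per dimension,
        # with a uniform random offset u in [0, delta) per dimension.
        delta = rng.gamma(shape=2.0, scale=sigma, size=d)
        u = rng.uniform(0.0, delta)
        idx = np.floor((X - u) / delta).astype(np.int64)  # (n, d) bin indices
        # Two points collide (share a feature) iff all d indices match.
        bins.append([tuple(row) for row in idx])
    return bins

def approx_kernel(bins, i, j):
    """Fraction of repetitions in which points i and j share a bin;
    its expectation equals the Laplacian kernel value."""
    return sum(b[i] == b[j] for b in bins) / len(bins)

# Compare the estimate against the exact kernel on two nearby points.
X = np.array([[0.0, 0.0], [0.3, -0.2]])
bins = random_binning_features(X, sigma=1.0, R=2000)
exact = np.exp(-np.abs(X[0] - X[1]).sum())  # exp(-0.5) ~ 0.61
est = approx_kernel(bins, 0, 1)
```

With 2000 repetitions the collision frequency `est` concentrates around the exact kernel value; the paper's contribution concerns how quickly such estimates converge and how well the resulting feature construction parallelizes.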

Papers citing "Revisiting Random Binning Features: Fast Convergence and Strong Parallelizability"

4 citing papers:

  • Fastfood: Approximate Kernel Expansions in Loglinear Time. Quoc V. Le, Tamás Sarlós, Alex Smola (13 Aug 2014)
  • Revisiting the Nystrom Method for Improved Large-Scale Machine Learning. Alex Gittens, Michael W. Mahoney (07 Mar 2013)
  • Parallel Coordinate Descent Methods for Big Data Optimization. Peter Richtárik, Martin Takáč (04 Dec 2012)
  • Parallel Coordinate Descent for L1-Regularized Loss Minimization. Joseph K. Bradley, Aapo Kyrola, Danny Bickson, Carlos Guestrin (26 May 2011)