ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Adaptive and Oblivious Randomized Subspace Methods for High-Dimensional Optimization: Sharp Analysis and Lower Bounds

arXiv:2012.07054
Jonathan Lacotte, Mert Pilanci
13 December 2020

Papers citing "Adaptive and Oblivious Randomized Subspace Methods for High-Dimensional Optimization: Sharp Analysis and Lower Bounds" (6 papers)

  • When Do Neural Networks Outperform Kernel Methods?
    Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari (24 Jun 2020)
  • Neural Networks are Convex Regularizers: Exact Polynomial-time Convex Optimization Formulations for Two-layer Networks
    Mert Pilanci, Tolga Ergen (24 Feb 2020)
  • Revealing the Structure of Deep Neural Networks via Convex Duality
    Tolga Ergen, Mert Pilanci (22 Feb 2020)
  • Optimal Randomized First-Order Methods for Least-Squares Problems
    Jonathan Lacotte, Mert Pilanci (21 Feb 2020)
  • Reverse iterative volume sampling for linear regression
    Michal Derezinski, Manfred K. Warmuth (06 Jun 2018)
  • How transferable are features in deep neural networks?
    J. Yosinski, Jeff Clune, Yoshua Bengio, Hod Lipson (06 Nov 2014)