ResearchTrend.AI

Less is More: Nyström Computational Regularization

16 July 2015
Alessandro Rudi, Raffaello Camoriano, Lorenzo Rosasco

Papers citing "Less is More: Nyström Computational Regularization"

7 of 7 citing papers shown.
Random Forest Autoencoders for Guided Representation Learning
Adrien Aumon, Shuang Ni, Myriam Lizotte, Guy Wolf, Kevin R. Moon, Jake S. Rhodes
18 Feb 2025

A Bound on the Maximal Marginal Degrees of Freedom
Paul Dommel
20 Feb 2024

Nonlinear Meta-Learning Can Guarantee Faster Rates
Dimitri Meunier, Zhu Li, Arthur Gretton, Samory Kpotufe
20 Jul 2023

Kernel-Based Distributed Q-Learning: A Scalable Reinforcement Learning Approach for Dynamic Treatment Regimes
Di Wang, Yao Wang, Shaojie Tang
21 Feb 2023

Quasi-Monte Carlo Feature Maps for Shift-Invariant Kernels
H. Avron, Vikas Sindhwani, Jiyan Yang, Michael W. Mahoney
29 Dec 2014

Revisiting the Nystrom Method for Improved Large-Scale Machine Learning
Alex Gittens, Michael W. Mahoney
07 Mar 2013

On Some Extensions of Bernstein's Inequality for Self-adjoint Operators
Stanislav Minsker
22 Dec 2011