Implicit Regularization of Random Feature Models

19 February 2020
Arthur Jacot
Berfin Simsek
Francesco Spadaro
Clément Hongler
Franck Gabriel
arXiv: 2002.08404
Abstract

Random Feature (RF) models are used as efficient parametric approximations of kernel methods. We investigate, by means of random matrix theory, the connection between Gaussian RF models and Kernel Ridge Regression (KRR). For a Gaussian RF model with P features, N data points, and a ridge λ, we show that the average (i.e. expected) RF predictor is close to a KRR predictor with an effective ridge λ̃. We show that λ̃ > λ and that λ̃ ↘ λ monotonically as P grows, thus revealing the implicit regularization effect of finite RF sampling. We then compare the risk (i.e. test error) of the λ̃-KRR predictor with the average risk of the λ-RF predictor and obtain a precise and explicit bound on their difference. Finally, we empirically find an extremely good agreement between the test errors of the average λ-RF predictor and the λ̃-KRR predictor.
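The implicit-regularization effect described in the abstract can be probed numerically: average many λ-RF predictors over independent feature draws, then compare the average against KRR predictors over a range of ridges. A minimal sketch follows, using random Fourier features for an RBF kernel as a stand-in for a Gaussian RF model; the toy data, kernel bandwidth, and all variable names are illustrative assumptions, not the paper's exact setup or formulas.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem (illustrative, not from the paper).
N, P, lam = 30, 50, 1e-3
X = rng.uniform(-1, 1, size=(N, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(N)
X_test = np.linspace(-1, 1, 100).reshape(-1, 1)
gamma = 5.0  # assumed RBF bandwidth

def rbf_kernel(A, B):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_predict(ridge):
    # Kernel ridge regression predictor with the given ridge.
    K = rbf_kernel(X, X)
    alpha = np.linalg.solve(K + N * ridge * np.eye(N), y)
    return rbf_kernel(X_test, X) @ alpha

def rf_predict(seed):
    # One random-feature predictor: random Fourier features whose
    # expected Gram matrix is the RBF kernel (w ~ N(0, 2*gamma*I)).
    r = np.random.default_rng(seed)
    W = r.normal(scale=np.sqrt(2 * gamma), size=(1, P))
    b = r.uniform(0, 2 * np.pi, P)
    phi = lambda Z: np.sqrt(2.0 / P) * np.cos(Z @ W + b)
    F = phi(X)
    theta = np.linalg.solve(F.T @ F + N * lam * np.eye(P), F.T @ y)
    return phi(X_test) @ theta

# Average the lambda-RF predictor over many independent feature draws.
avg_rf = np.mean([rf_predict(s) for s in range(200)], axis=0)

# Find which KRR ridge best matches the averaged RF predictor; for
# finite P this "effective" ridge should not fall below the bare lambda.
err_bare = np.abs(avg_rf - krr_predict(lam)).mean()
ridges = np.geomspace(lam, 1.0, 50)
errs = [np.abs(avg_rf - krr_predict(r)).mean() for r in ridges]
best = ridges[int(np.argmin(errs))]
print(f"best-matching ridge {best:.4f} vs bare ridge {lam:.4f}")
```

The grid starts at the bare ridge λ, so the best-matching KRR predictor is never worse a match than λ-KRR itself; the gap between the best-matching ridge and λ shrinks as P grows, which is the monotone λ̃ ↘ λ behaviour the paper proves.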
