ResearchTrend.AI

Steps Toward Deep Kernel Methods from Infinite Neural Networks
Tamir Hazan, Tommi Jaakkola
20 August 2015 · arXiv:1508.05133

Papers citing "Steps Toward Deep Kernel Methods from Infinite Neural Networks"

6 / 6 papers shown

  1. Deep Horseshoe Gaussian Processes
     Ismael Castillo, Thibault Randrianarisoa · 04 Mar 2024 · BDL, UQCV
  2. Connecting NTK and NNGP: A Unified Theoretical Framework for Wide Neural Network Learning Dynamics
     Yehonatan Avidan, Qianyi Li, H. Sompolinsky · 08 Sep 2023
  3. Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes
     Greg Yang · 28 Oct 2019
  4. How to Scale Up Kernel Methods to Be As Good As Deep Neural Nets
     Zhiyun Lu, Avner May, Kuan Liu, A. Garakani, Dong Guo, ..., Linxi Fan, Michael Collins, Brian Kingsbury, M. Picheny, Fei Sha · 14 Nov 2014 · BDL
  5. Fastfood: Approximate Kernel Expansions in Loglinear Time
     Quoc V. Le, Tamás Sarlós, Alex Smola · 13 Aug 2014
  6. Dropout Training as Adaptive Regularization
     Stefan Wager, Sida I. Wang, Percy Liang · 04 Jul 2013