Stochastic Gradient Descent for Gaussian Processes Done Right

31 October 2023
J. Lin, Shreyas Padhy, Javier Antorán, Austin Tripp, Alexander Terenin, Csaba Szepesvári, José Miguel Hernández-Lobato, David Janz
arXiv: 2310.20581

Papers citing "Stochastic Gradient Descent for Gaussian Processes Done Right" (14 papers)
Joker: Joint Optimization Framework for Lightweight Kernel Machines
  Junhong Zhang, Zhihui Lai. 23 May 2025.

Improving Linear System Solvers for Hyperparameter Optimisation in Iterative Gaussian Processes
  J. Lin, Shreyas Padhy, Bruno Mlodozeniec, Javier Antorán, José Miguel Hernández-Lobato. 28 May 2024.

Tanimoto Random Features for Scalable Molecular Machine Learning
  Austin Tripp, S. Bacallado, Sukriti Singh, José Miguel Hernández-Lobato. 26 Jun 2023.

Sampling from Gaussian Process Posteriors using Stochastic Gradient Descent
  J. Lin, Javier Antorán, Shreyas Padhy, David Janz, José Miguel Hernández-Lobato, Alexander Terenin. 20 Jun 2023.

Sampling-based inference for large linear models, with application to linearised Laplace
  Javier Antorán, Shreyas Padhy, Riccardo Barbano, Eric T. Nalisnick, David Janz, José Miguel Hernández-Lobato. 10 Oct 2022. [BDL]

DOCKSTRING: easy molecular docking yields better benchmarks for ligand design
  Miguel García-Ortegón, G. Simm, Austin Tripp, José Miguel Hernández-Lobato, A. Bender, S. Bacallado. 29 Oct 2021.

Efficiently Sampling Functions from Gaussian Process Posteriors
  James T. Wilson, Viacheslav Borovitskiy, Alexander Terenin, P. Mostowsky, M. Deisenroth. 21 Feb 2020.

Exact Gaussian Processes on a Million Data Points
  Ke Alexander Wang, Geoff Pleiss, Jacob R. Gardner, Stephen Tyree, Kilian Q. Weinberger, A. Wilson. 19 Mar 2019. [GP]

Parallel and Distributed Thompson Sampling for Large-scale Accelerated Exploration of Chemical Space
  José Miguel Hernández-Lobato, James Requeima, Edward O. Pyzer-Knapp, Alán Aspuru-Guzik. 06 Jun 2017.

FALKON: An Optimal Large Scale Kernel Method
  Alessandro Rudi, Luigi Carratino, Lorenzo Rosasco. 31 May 2017.

Neural Message Passing for Quantum Chemistry
  Justin Gilmer, S. Schoenholz, Patrick F. Riley, Oriol Vinyals, George E. Dahl. 04 Apr 2017.

Gaussian Processes for Big Data
  J. Hensman, Nicolò Fusi, Neil D. Lawrence. 26 Sep 2013. [GP]

Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
  Shai Shalev-Shwartz, Tong Zhang. 10 Sep 2012.

Greedy Block Coordinate Descent for Large Scale Gaussian Process Regression
  Liefeng Bo, C. Sminchisescu. 13 Jun 2012.