Willump: A Statistically-Aware End-to-end Optimizer for Machine Learning Inference

3 June 2019
Peter Kraft, Daniel Kang, Deepak Narayanan, Shoumik Palkar, Peter Bailis, Matei A. Zaharia

Papers citing "Willump: A Statistically-Aware End-to-end Optimizer for Machine Learning Inference"

4 papers shown.

Efficient Multi-stage Inference on Tabular Data
Daniel S. Johnson, Igor L. Markov
21 Mar 2023

Accelerating Deep Learning Inference via Learned Caches
Arjun Balasubramanian, Adarsh Kumar, Yuhan Liu, Han Cao, Shivaram Venkataraman, Aditya Akella
18 Jan 2021

A Tensor Compiler for Unified Machine Learning Prediction Serving
Supun Nakandala, Karla Saur, Gyeong-In Yu, Konstantinos Karanasos, Carlo Curino, Markus Weimer, Matteo Interlandi
09 Oct 2020

Extending Relational Query Processing with ML Inference
Konstantinos Karanasos, Matteo Interlandi, Doris Xin, Fotis Psallidas, Rathijit Sen, ..., Subru Krishnan, Markus Weimer, Yuan Yu, R. Ramakrishnan, Carlo Curino
01 Nov 2019