Deeplite Neutrino: An End-to-End Framework for Constrained Deep Learning Model Optimization

11 January 2021
A. Sankaran, Olivier Mastropietro, Ehsan Saboori, Yasser Idris, Davis Sawyer, Mohammadhossein Askarihemmat, G. B. Hacene

Papers citing "Deeplite Neutrino: An End-to-End Framework for Constrained Deep Learning Model Optimization"

2 of 2 papers shown.

1. Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression
   Yawei Li, Shuhang Gu, Christoph Mayer, Luc Van Gool, Radu Timofte
   19 Mar 2020

2. Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
   M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro (tagged: MoE)
   17 Sep 2019