Enhancing Scalability in Recommender Systems through Lottery Ticket Hypothesis and Knowledge Distillation-based Neural Network Pruning

19 January 2024
R. Rajaram, Manoj Bharadhwaj, VS Vasan, N. Pervin

Papers citing "Enhancing Scalability in Recommender Systems through Lottery Ticket Hypothesis and Knowledge Distillation-based Neural Network Pruning"

Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji, Byeongho Heo, Sungrae Park
05 Feb 2021
The Lottery Ticket Hypothesis for Object Recognition
Sharath Girish, Shishira R. Maiya, Kamal Gupta, Hao Chen, L. Davis, Abhinav Shrivastava
08 Dec 2020