Topology Distillation for Recommender System
SeongKu Kang, Junyoung Hwang, Wonbin Kweon, Hwanjo Yu
16 June 2021

Papers citing "Topology Distillation for Recommender System"

11 papers shown
Spatial-Temporal Knowledge Distillation for Takeaway Recommendation
Shuyuan Zhao, Wei Chen, Boyan Shi, Liyong Zhou, Shuohao Lin, Huaiyu Wan
21 Dec 2024
Exploring Feature-based Knowledge Distillation for Recommender System: A Frequency Perspective
Zhangchi Zhu, Wei Zhang
16 Nov 2024
Retrieval and Distill: A Temporal Data Shift-Free Paradigm for Online Recommendation System
Lei Zheng, Ning Li, Weinan Zhang, Yong Yu
24 Apr 2024
MvFS: Multi-view Feature Selection for Recommender System
Youngjune Lee, Yeongjong Jeong, Keunchan Park, SeongKu Kang
05 Sep 2023
Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
08 Aug 2023
Distillation from Heterogeneous Models for Top-K Recommendation
SeongKu Kang, Wonbin Kweon, Dongha Lee, Jianxun Lian, Xing Xie, Hwanjo Yu
02 Mar 2023
Unbiased Knowledge Distillation for Recommendation
Gang Chen, Jiawei Chen, Fuli Feng, Sheng Zhou, Xiangnan He
27 Nov 2022
Linkless Link Prediction via Relational Distillation
Zhichun Guo, William Shiao, Shichang Zhang, Yozen Liu, Nitesh Chawla, Neil Shah, Tong Zhao
11 Oct 2022
Cooperative Retriever and Ranker in Deep Recommenders
Xunpeng Huang, Defu Lian, Jin Chen, Liu Zheng, Xing Xie, Enhong Chen
28 Jun 2022
Consensus Learning from Heterogeneous Objectives for One-Class Collaborative Filtering
SeongKu Kang, Dongha Lee, Wonbin Kweon, Junyoung Hwang, Hwanjo Yu
26 Feb 2022
Dual Correction Strategy for Ranking Distillation in Top-N Recommender System
Youngjune Lee, Kee-Eung Kim
08 Sep 2021