Teach Harder, Learn Poorer: Rethinking Hard Sample Distillation for GNN-to-MLP Knowledge Distillation
arXiv:2407.14768 · 20 July 2024
Lirong Wu, Yunfan Liu, Haitao Lin, Yufei Huang, Stan Z. Li
Papers citing "Teach Harder, Learn Poorer: Rethinking Hard Sample Distillation for GNN-to-MLP Knowledge Distillation" (3 of 3 papers shown)
Heuristic Methods are Good Teachers to Distill MLPs for Graph Link Prediction
Zongyue Qin, Shichang Zhang, Mingxuan Ju, Tong Zhao, Neil Shah, Yizhou Sun
26 · 0 · 0 · 08 Apr 2025
Iterative Graph Self-Distillation
Hanlin Zhang, Shuai Lin, Weiyang Liu, Pan Zhou, Jian Tang, Xiaodan Liang, Eric P. Xing
SSL · 57 · 33 · 0 · 23 Oct 2020
Distilling Knowledge from Graph Convolutional Networks
Yiding Yang, Jiayan Qiu, Xiuming Zhang, Dacheng Tao, Xinchao Wang
160 · 226 · 0 · 23 Mar 2020