arXiv:1911.05700
Graph Representation Learning via Multi-task Knowledge Distillation
11 November 2019
Jiaqi Ma, Qiaozhu Mei
Papers citing "Graph Representation Learning via Multi-task Knowledge Distillation"
8 / 8 papers shown
OLGA: One-cLass Graph Autoencoder
M. Gôlo, José Gilberto Barbosa de Medeiros Junior, Diego Furtado Silva, R. Marcacini
13 Jun 2024

Graph-based Knowledge Distillation: A survey and experimental evaluation
Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao
27 Feb 2023

Linkless Link Prediction via Relational Distillation
Zhichun Guo, William Shiao, Shichang Zhang, Yozen Liu, Nitesh Chawla, Neil Shah, Tong Zhao
11 Oct 2022

Universal Representations: A Unified Look at Multiple Task and Domain Learning
Wei-Hong Li, Xialei Liu, Hakan Bilen
Tags: SSL, OOD
06 Apr 2022

Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation
Sumanth Chennupati, Mohammad Mahdi Kamani, Zhongwei Cheng, Lin Chen
19 Oct 2021

Universal Representation Learning from Multiple Domains for Few-shot Classification
Weihong Li, Xialei Liu, Hakan Bilen
Tags: SSL, OOD, VLM
25 Mar 2021

Compacting Deep Neural Networks for Internet of Things: Methods and Applications
Ke Zhang, Hanbo Ying, Hongning Dai, Lin Li, Yuangyuang Peng, Keyi Guo, Hongfang Yu
20 Mar 2021

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
Tags: VLM
09 Jun 2020