ResearchTrend.AI
Graph Representation Learning via Multi-task Knowledge Distillation

11 November 2019
Jiaqi Ma, Qiaozhu Mei

Papers citing "Graph Representation Learning via Multi-task Knowledge Distillation"

8 / 8 papers shown
OLGA: One-cLass Graph Autoencoder
M. Gôlo, José Gilberto Barbosa de Medeiros Junior, Diego Furtado Silva, R. Marcacini
13 Jun 2024
Graph-based Knowledge Distillation: A survey and experimental evaluation
Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao
27 Feb 2023
Linkless Link Prediction via Relational Distillation
Zhichun Guo, William Shiao, Shichang Zhang, Yozen Liu, Nitesh Chawla, Neil Shah, Tong Zhao
11 Oct 2022
Universal Representations: A Unified Look at Multiple Task and Domain Learning
Wei-Hong Li, Xialei Liu, Hakan Bilen
06 Apr 2022
Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation
Sumanth Chennupati, Mohammad Mahdi Kamani, Zhongwei Cheng, Lin Chen
19 Oct 2021
Universal Representation Learning from Multiple Domains for Few-shot Classification
Weihong Li, Xialei Liu, Hakan Bilen
25 Mar 2021
Compacting Deep Neural Networks for Internet of Things: Methods and Applications
Ke Zhang, Hanbo Ying, Hongning Dai, Lin Li, Yuangyuang Peng, Keyi Guo, Hongfang Yu
20 Mar 2021
Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020