DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning
arXiv:2009.05912, 13 September 2020
Yushan Zhu, Wen Zhang, Mingyang Chen, Hui Chen, Xu-Xin Cheng, Wei Zhang, Huajun Chen (Zhejiang University)
Papers citing "DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning" (6 papers):
1. Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion. Cunhang Fan, Yujie Chen, Jun Xue, Yonghui Kong, Jianhua Tao, Zhao Lv. 19 Jan 2024.
2. Random Entity Quantization for Parameter-Efficient Compositional Knowledge Graph Representation. Jiaang Li, Quan Wang, Yi Liu, L. Zhang, Zhendong Mao. 24 Oct 2023.
3. From Wide to Deep: Dimension Lifting Network for Parameter-efficient Knowledge Graph Embedding. Borui Cai, Yong Xiang, Longxiang Gao, Di Wu, Heng Zhang, Jiongdao Jin, Tom H. Luan. 22 Mar 2023.
4. Graph-based Knowledge Distillation: A survey and experimental evaluation. Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao. 27 Feb 2023.
5. A Survey of Knowledge Graph Reasoning on Graph Types: Static, Dynamic, and Multimodal. K. Liang, Lingyuan Meng, Meng Liu, Yue Liu, Wenxuan Tu, Siwei Wang, Sihang Zhou, Xinwang Liu, Fu Sun. 12 Dec 2022.
6. Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding. Yichen Liu, C. Wang, Defang Chen, Zhehui Zhou, Yan Feng, Chun-Yen Chen. 07 Jun 2022.