Simplified TinyBERT: Knowledge Distillation for Document Retrieval
arXiv:2009.07531 · 16 September 2020
Xuanang Chen, Xianpei Han, Kai Hui, Le Sun, Yingfei Sun
Papers citing "Simplified TinyBERT: Knowledge Distillation for Document Retrieval" (5 papers):
- ED2LM: Encoder-Decoder to Language Model for Faster Document Re-ranking Inference
  Kai Hui, Honglei Zhuang, Tao Chen, Zhen Qin, Jing Lu, ..., Ji Ma, Jai Gupta, Cicero Nogueira dos Santos, Yi Tay, Donald Metzler
  25 Apr 2022

- Do Lessons from Metric Learning Generalize to Image-Caption Retrieval?
  Maurits J. R. Bleeker, Maarten de Rijke
  14 Feb 2022

- Multi-Modality Distillation via Learning the teacher's modality-level Gram Matrix
  Peng Liu
  21 Dec 2021

- Efficiently Teaching an Effective Dense Retriever with Balanced Topic Aware Sampling
  Sebastian Hofstatter, Sheng-Chieh Lin, Jheng-Hong Yang, Jimmy J. Lin, Allan Hanbury
  14 Apr 2021

- Overview of the TREC 2020 deep learning track
  Nick Craswell, Bhaskar Mitra, Emine Yilmaz, Daniel Fernando Campos
  15 Feb 2021