Simplified TinyBERT: Knowledge Distillation for Document Retrieval

16 September 2020
Xuanang Chen, Xianpei Han, Kai Hui, Le Sun, Yingfei Sun

Papers citing "Simplified TinyBERT: Knowledge Distillation for Document Retrieval"

5 / 5 papers shown
ED2LM: Encoder-Decoder to Language Model for Faster Document Re-ranking Inference
Kai Hui, Honglei Zhuang, Tao Chen, Zhen Qin, Jing Lu, ..., Ji Ma, Jai Gupta, Cicero Nogueira dos Santos, Yi Tay, Donald Metzler
25 Apr 2022
Do Lessons from Metric Learning Generalize to Image-Caption Retrieval?
Maurits J. R. Bleeker, Maarten de Rijke
14 Feb 2022
Multi-Modality Distillation via Learning the teacher's modality-level Gram Matrix
Peng Liu
21 Dec 2021
Efficiently Teaching an Effective Dense Retriever with Balanced Topic Aware Sampling
Sebastian Hofstatter, Sheng-Chieh Lin, Jheng-Hong Yang, Jimmy J. Lin, Allan Hanbury
14 Apr 2021
Overview of the TREC 2020 deep learning track
Nick Craswell, Bhaskar Mitra, Emine Yilmaz, Daniel Fernando Campos
15 Feb 2021