ResearchTrend.AI
Marginal Utility Diminishes: Exploring the Minimum Knowledge for BERT Knowledge Distillation

10 June 2021
Yuanxin Liu
Fandong Meng
Zheng Lin
Weiping Wang
Jie Zhou

Papers citing "Marginal Utility Diminishes: Exploring the Minimum Knowledge for BERT Knowledge Distillation"

A Win-win Deal: Towards Sparse and Robust Pre-trained Language Models
Yuanxin Liu
Fandong Meng
Zheng Lin
JiangNan Li
Peng Fu
Yanan Cao
Weiping Wang
Jie Zhou
11 Oct 2022