KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks

6 October 2021
Rishabh Bhardwaj, Tushar Vaidya, Soujanya Poria
Communities: OT, FedML
Links: ArXiv · PDF · HTML
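
As the title indicates, KNOT performs knowledge distillation with an optimal-transport objective rather than the usual KL-divergence matching. As a rough orientation only, the sketch below shows a generic entropic-OT (Sinkhorn) distillation loss between teacher and student output distributions. It is not KNOT's exact objective; the cost matrix, regularization strength `eps`, and iteration count are illustrative assumptions.

```python
# Illustrative sketch only: a generic entropic-OT (Sinkhorn) distillation
# loss between teacher and student class distributions. NOT the exact KNOT
# objective; the cost matrix and hyperparameters are assumptions.
import torch

def sinkhorn_distillation_loss(student_logits, teacher_logits, cost,
                               eps=0.1, n_iters=50):
    """OT distance between student and teacher output distributions.

    student_logits, teacher_logits: (batch, num_classes)
    cost: (num_classes, num_classes) ground cost between class pairs
    """
    a = torch.softmax(student_logits, dim=-1)           # student marginal
    b = torch.softmax(teacher_logits, dim=-1).detach()  # teacher marginal, fixed
    K = torch.exp(-cost / eps)                          # Gibbs kernel
    u = torch.ones_like(a)
    v = torch.ones_like(b)
    for _ in range(n_iters):                            # Sinkhorn scaling
        u = a / (v @ K.T + 1e-9)                        # u_i = a_i / (K v)_i
        v = b / (u @ K + 1e-9)                          # v_j = b_j / (K^T u)_j
    # Transport cost <P, C> with plan P_ij = u_i * K_ij * v_j
    return torch.einsum("bi,ij,bj,ij->b", u, K, v, cost).mean()

# Toy usage with a 0/1 cost, under which the OT cost roughly behaves like
# a total-variation distance between the two distributions.
if __name__ == "__main__":
    V = 8
    cost = 1.0 - torch.eye(V)
    student = torch.randn(4, V, requires_grad=True)
    teacher = torch.randn(4, V)
    loss = sinkhorn_distillation_loss(student, teacher, cost)
    loss.backward()  # gradients flow only to the student logits
    print(float(loss))
```

Unlike KL distillation, an OT loss can encode how costly it is to confuse one class for another via the ground cost matrix, which is the usual motivation for this family of objectives.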

Papers citing "KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks"

5 of 5 citing papers shown:

Towards Cross-Tokenizer Distillation: the Universal Logit Distillation Loss for LLMs
Nicolas Boizard, Kevin El Haddad, Céline Hudelot, Pierre Colombo
28 Jan 2025 · Metrics: 75 · 14 · 0

Multi-Level Optimal Transport for Universal Cross-Tokenizer Knowledge Distillation on Language Models
Xiao Cui, Mo Zhu, Yulei Qin, Liang Xie, Wengang Zhou, Hao Li
19 Dec 2024 · Metrics: 83 · 4 · 0

Optimal Transport Guided Correlation Assignment for Multimodal Entity Linking
Zefeng Zhang, Jiawei Sheng, Chuang Zhang, Yunzhi Liang, Wenyuan Zhang, Siqi Wang, Tingwen Liu
Communities: OT
04 Jun 2024 · Metrics: 29 · 2 · 0

Language Models are Homer Simpson! Safety Re-Alignment of Fine-tuned Language Models through Task Arithmetic
Rishabh Bhardwaj, Do Duc Anh, Soujanya Poria
Communities: MoMe
19 Feb 2024 · Metrics: 50 · 36 · 0

A Survey on Bias and Fairness in Machine Learning
Ninareh Mehrabi, Fred Morstatter, N. Saxena, Kristina Lerman, Aram Galstyan
Communities: SyDa, FaML
23 Aug 2019 · Metrics: 323 · 4,212 · 0