Distilling Closed-Source LLM's Knowledge for Locally Stable and Economic Biomedical Entity Linking

26 May 2025
Yihao Ai, Zhiyuan Ning, Weiwei Dai, P. Wang, Yi Du, Wenjuan Cui, Kunpeng Liu, Yuanchun Zhou
ArXiv (abs) · PDF · HTML

Papers citing "Distilling Closed-Source LLM's Knowledge for Locally Stable and Economic Biomedical Entity Linking"

2 / 2 papers shown
Rethinking Graph Contrastive Learning through Relative Similarity Preservation
Zhiyuan Ning, Meng Xiao, Ziyue Qiao, Pengyang Wang, Yuanchun Zhou
08 May 2025
m-KAILIN: Knowledge-Driven Agentic Scientific Corpus Distillation Framework for Biomedical Large Language Models Training
Meng Xiao, Xunxin Cai, Chengrui Wang, Yuanchun Zhou, Hengshu Zhu
28 Apr 2025