ResearchTrend.AI
On The Role of Pretrained Language Models in General-Purpose Text Embeddings: A Survey

28 July 2025
Meishan Zhang, Xin Zhang, X. Zhao, Shouzheng Huang, Baotian Hu, Min Zhang
ArXiv (abs) · PDF · HTML · GitHub (40,835★)

Papers citing "On The Role of Pretrained Language Models in General-Purpose Text Embeddings: A Survey" (2 of 2 papers shown)

1. Revealing the Numeracy Gap: An Empirical Investigation of Text Embedding Models
   Ningyuan Deng, Hanyu Duan, Yixuan Tang, Yi Yang
   06 Sep 2025

2. KaLM-Embedding-V2: Superior Training Techniques and Data Inspire A Versatile Embedding Model
   Xinping Zhao, Xinshuo Hu, Zifei Shan, Shouzheng Huang, Yao Zhou, ..., Meishan Zhang, Haofen Wang, Jun-chen Yu, Baotian Hu, Min Zhang
   VLM
   26 Jun 2025