arXiv:2507.20783 · v2 (latest)
On The Role of Pretrained Language Models in General-Purpose Text Embeddings: A Survey
28 July 2025
Meishan Zhang, Xin Zhang, X. Zhao, Shouzheng Huang, Baotian Hu, Min Zhang
Links: arXiv (abs) · PDF · HTML · GitHub (40,835★)
Papers citing "On The Role of Pretrained Language Models in General-Purpose Text Embeddings: A Survey" (2 papers)
Revealing the Numeracy Gap: An Empirical Investigation of Text Embedding Models
Ningyuan Deng, Hanyu Duan, Yixuan Tang, Yi Yang
06 Sep 2025
KaLM-Embedding-V2: Superior Training Techniques and Data Inspire A Versatile Embedding Model
Xinping Zhao, Xinshuo Hu, Zifei Shan, Shouzheng Huang, Yao Zhou, ..., Meishan Zhang, Haofen Wang, Jun-chen Yu, Baotian Hu, Min Zhang
VLM
26 Jun 2025