arXiv:2505.07863
QoSBERT: An Uncertainty-Aware Approach based on Pre-trained Language Models for Service Quality Prediction
9 May 2025
Ziliang Wang, Xiaohong Zhang, Ze Shi Li, Meng Yan
Papers citing "QoSBERT: An Uncertainty-Aware Approach based on Pre-trained Language Models for Service Quality Prediction" (4 / 4 papers shown)
Domain-Adaptive Pretraining Methods for Dialogue Understanding (28 May 2021)
Han Wu, Kun Xu, Linfeng Song, Lifeng Jin, Haisong Zhang, Linqi Song
AI4CE · 47 · 18 · 0
What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models (14 Apr 2020)
Wietse de Vries, Andreas van Cranenburgh, Malvina Nissim
MILM · SSeg · MoE · 140 · 66 · 0
CodeBERT: A Pre-Trained Model for Programming and Natural Languages (19 Feb 2020)
Zhangyin Feng, Daya Guo, Duyu Tang, Nan Duan, Xiaocheng Feng, ..., Linjun Shou, Bing Qin, Ting Liu, Daxin Jiang, Ming Zhou
165 · 2,637 · 0
RoBERTa: A Robustly Optimized BERT Pretraining Approach (26 Jul 2019)
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
AIMat · 665 · 24,528 · 0