QoSBERT: An Uncertainty-Aware Approach based on Pre-trained Language Models for Service Quality Prediction

9 May 2025
Ziliang Wang, Xiaohong Zhang, Ze Shi Li, Meng Yan
ArXiv (abs) · PDF · HTML

Papers citing "QoSBERT: An Uncertainty-Aware Approach based on Pre-trained Language Models for Service Quality Prediction"

4 / 4 papers shown

Domain-Adaptive Pretraining Methods for Dialogue Understanding
Han Wu, Kun Xu, Linfeng Song, Lifeng Jin, Haisong Zhang, Linqi Song
28 May 2021

What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models
Wietse de Vries, Andreas van Cranenburgh, Malvina Nissim
14 Apr 2020

CodeBERT: A Pre-Trained Model for Programming and Natural Languages
Zhangyin Feng, Daya Guo, Duyu Tang, Nan Duan, Xiaocheng Feng, ..., Linjun Shou, Bing Qin, Ting Liu, Daxin Jiang, Ming Zhou
19 Feb 2020

RoBERTa: A Robustly Optimized BERT Pretraining Approach
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
26 Jul 2019