TRELM: Towards Robust and Efficient Pre-training for Knowledge-Enhanced Language Models
arXiv: 2403.11203
17 March 2024
Junbing Yan
Chengyu Wang
Taolin Zhang
Xiaofeng He
Jun Huang
Longtao Huang
Hui Xue
Wei Zhang
Tags: VLM, KELM
Papers citing "TRELM: Towards Robust and Efficient Pre-training for Knowledge-Enhanced Language Models" (5 of 5 papers shown)
KALM: Knowledge-Aware Integration of Local, Document, and Global Contexts for Long Document Understanding
Shangbin Feng
Zhaoxuan Tan
Wenqian Zhang
Zhenyu Lei
Yulia Tsvetkov
Tags: KELM, VLM
08 Oct 2022
K-BERT: Enabling Language Representation with Knowledge Graph
Weijie Liu
Peng Zhou
Zhe Zhao
Zhiruo Wang
Qi Ju
Haotang Deng
Ping Wang
17 Sep 2019
Knowledge Enhanced Contextual Word Representations
Matthew E. Peters
Mark Neumann
Robert L. Logan IV
Roy Schwartz
Vidur Joshi
Sameer Singh
Noah A. Smith
09 Sep 2019
Language Models as Knowledge Bases?
Fabio Petroni
Tim Rocktäschel
Patrick Lewis
A. Bakhtin
Yuxiang Wu
Alexander H. Miller
Sebastian Riedel
Tags: KELM, AI4MH
03 Sep 2019
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang
Amanpreet Singh
Julian Michael
Felix Hill
Omer Levy
Samuel R. Bowman
Tags: ELM
20 Apr 2018