arXiv: 2109.01048
Pre-training Language Model Incorporating Domain-specific Heterogeneous Knowledge into A Unified Representation
2 September 2021
Hongyin Zhu, Hao Peng, Zhiheng Lyu, Lei Hou, Juan-Zi Li, Jinghui Xiao
Papers citing "Pre-training Language Model Incorporating Domain-specific Heterogeneous Knowledge into A Unified Representation" (7 of 7 papers shown)
MEG: Medical Knowledge-Augmented Large Language Models for Question Answering
Laura Cabello, Carmen Martin-Turrero, Uchenna Akujuobi, Anders Søgaard, Carlos Bobed
AI4MH · 06 Nov 2024
HippoRAG: Neurobiologically Inspired Long-Term Memory for Large Language Models
Bernal Jiménez Gutiérrez, Yiheng Shu, Yu Gu, Michihiro Yasunaga, Yu-Chuan Su
RALM, CLL · 23 May 2024
DrBERT: A Robust Pre-trained Model in French for Biomedical and Clinical domains
Yanis Labrak, Adrien Bazoge, Richard Dufour, Mickael Rouvier, Emmanuel Morin, B. Daille, P. Gourraud
LM&MA · 03 Apr 2023
Industry Risk Assessment via Hierarchical Financial Data Using Stock Market Sentiment Indicators
Hongyin Zhu
AIFin · 05 Mar 2023
K-12BERT: BERT for K-12 education
Vasu Goel, Dhruv Sahnan, Venktesh V, Gaurav Sharma, Deep Dwivedi, Mukesh Mohania
AI4Ed · 24 May 2022
K-BERT: Enabling Language Representation with Knowledge Graph
Weijie Liu, Peng Zhou, Zhe Zhao, Zhiruo Wang, Qi Ju, Haotang Deng, Ping Wang
17 Sep 2019
Language Models as Knowledge Bases?
Fabio Petroni, Tim Rocktäschel, Patrick Lewis, A. Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel
KELM, AI4MH · 03 Sep 2019