ResearchTrend.AI · arXiv:1912.00147 · Cited By
Integrating Graph Contextualized Knowledge into Pre-trained Language Models

30 November 2019
Bin He, Di Zhou, Jinghui Xiao, Xin Jiang, Qun Liu, Nicholas Jing Yuan, Tong Xu
AI4CE

Papers citing "Integrating Graph Contextualized Knowledge into Pre-trained Language Models"

13 citing papers:
Question Answering with Deep Neural Networks for Semi-Structured Heterogeneous Genealogical Knowledge Graphs
Omri Suissa, M. Zhitomirsky-Geffet, Avshalom Elmalech
GNN, BDL · 30 Jul 2023
Towards Medical Artificial General Intelligence via Knowledge-Enhanced Multimodal Pretraining
Bingqian Lin, Zicong Chen, Mingjie Li, Haokun Lin, Hang Xu, ..., Ling-Hao Chen, Xiaojun Chang, Yi Yang, L. Xing, Xiaodan Liang
LM&MA, MedIm, AI4CE · 26 Apr 2023
A Retrieve-and-Read Framework for Knowledge Graph Link Prediction
Vardaan Pahuja, Boshi Wang, Hugo Latapie, Jayanth Srinivasa, Yu-Chuan Su
19 Dec 2022
Deep Bidirectional Language-Knowledge Graph Pretraining
Michihiro Yasunaga, Antoine Bosselut, Hongyu Ren, Xikun Zhang, Christopher D. Manning, Percy Liang, J. Leskovec
17 Oct 2022
Cross-modal Clinical Graph Transformer for Ophthalmic Report Generation
Mingjie Li, Wenjia Cai, Karin Verspoor, Shirui Pan, Xiaodan Liang, Xiaojun Chang
MedIm · 04 Jun 2022
LinkBERT: Pretraining Language Models with Document Links
Michihiro Yasunaga, J. Leskovec, Percy Liang
KELM · 29 Mar 2022
Coreference Resolution for the Biomedical Domain: A Survey
Pengcheng Lu, Massimo Poesio
25 Sep 2021
K-AID: Enhancing Pre-trained Language Models with Domain Knowledge for Question Answering
Fu Sun, Feng-Lin Li, Ruize Wang, Qianglong Chen, Xingyi Cheng, Ji Zhang
VLM, KELM · 22 Sep 2021
Mixed Attention Transformer for Leveraging Word-Level Knowledge to Neural Cross-Lingual Information Retrieval
Zhiqi Huang, Hamed Bonab, Sheikh Muhammad Sarwar, Razieh Rahimi, James Allan
07 Sep 2021
Combining pre-trained language models and structured knowledge
Pedro Colon-Hernandez, Catherine Havasi, Jason B. Alonso, Matthew Huggins, C. Breazeal
KELM · 28 Jan 2021
CoLAKE: Contextualized Language and Knowledge Embedding
Tianxiang Sun, Yunfan Shao, Xipeng Qiu, Qipeng Guo, Yaru Hu, Xuanjing Huang, Zheng-Wei Zhang
KELM · 01 Oct 2020
Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
LM&MA, VLM · 18 Mar 2020
K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters
Ruize Wang, Duyu Tang, Nan Duan, Zhongyu Wei, Xuanjing Huang, Jianshu Ji, Guihong Cao, Daxin Jiang, Ming Zhou
KELM · 05 Feb 2020