Boosting Low-Resource Biomedical QA via Entity-Aware Masking Strategies
arXiv:2102.08366 · 16 February 2021
Gabriele Pergola, E. Kochkina, Lin Gui, Maria Liakata, Yulan He
Papers citing "Boosting Low-Resource Biomedical QA via Entity-Aware Masking Strategies" (19 of 19 papers shown):

1. A Survey of Large Language Models for Healthcare: from Data, Technology, and Applications to Accountability and Ethics (28 Jan 2025)
   Kai He, Rui Mao, Qika Lin, Yucheng Ruan, Xiang Lan, Mengling Feng, Min Zhang. Tags: LM&MA, AILaw. 213 · 173 · 0

2. X-FACTR: Multilingual Factual Knowledge Retrieval from Pretrained Language Models (13 Oct 2020)
   Zhengbao Jiang, Antonios Anastasopoulos, Jun Araki, Haibo Ding, Graham Neubig. Tags: HILM, KELM. 67 · 144 · 0

3. Don't Stop Pretraining: Adapt Language Models to Domains and Tasks (23 Apr 2020)
   Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith. Tags: VLM, AI4CE, CLL. 155 · 2,428 · 0

4. CORD-19: The COVID-19 Open Research Dataset (22 Apr 2020)
   Lucy Lu Wang, Kyle Lo, Yoganand Chandrasekhar, Russell Reas, Jiangjiang Yang, …, Boya Xie, Douglas A. Raymond, Daniel S. Weld, Oren Etzioni, Sebastian Kohlmeier. 98 · 811 · 0

5. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (23 Oct 2019)
   Colin Raffel, Noam M. Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu. Tags: AIMat. 445 · 20,181 · 0

6. Pre-trained Language Model for Biomedical Question Answering (18 Sep 2019)
   Wonjin Yoon, Jinhyuk Lee, Donghyeon Kim, Minbyul Jeong, Jaewoo Kang. Tags: AI4MH. 59 · 86 · 0

7. Knowledge Enhanced Contextual Word Representations (09 Sep 2019)
   Matthew E. Peters, Mark Neumann, Robert L. Logan IV, Roy Schwartz, Vidur Joshi, Sameer Singh, Noah A. Smith. 279 · 659 · 0

8. ERNIE 2.0: A Continual Pre-training Framework for Language Understanding (29 Jul 2019)
   Yu Sun, Shuohuan Wang, Yukun Li, Shikun Feng, Hao Tian, Hua Wu, Haifeng Wang. Tags: CLL. 87 · 810 · 0

9. RoBERTa: A Robustly Optimized BERT Pretraining Approach (26 Jul 2019)
   Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov. Tags: AIMat. 665 · 24,528 · 0

10. SpanBERT: Improving Pre-training by Representing and Predicting Spans (24 Jul 2019)
    Mandar Joshi, Danqi Chen, Yinhan Liu, Daniel S. Weld, Luke Zettlemoyer, Omer Levy. 147 · 1,965 · 0

11. XLNet: Generalized Autoregressive Pretraining for Language Understanding (19 Jun 2019)
    Zhilin Yang, Zihang Dai, Yiming Yang, J. Carbonell, Ruslan Salakhutdinov, Quoc V. Le. Tags: AI4CE. 232 · 8,433 · 0

12. ERNIE: Enhanced Language Representation with Informative Entities (17 May 2019)
    Zhengyan Zhang, Xu Han, Zhiyuan Liu, Xin Jiang, Maosong Sun, Qun Liu. 109 · 1,397 · 0

13. SciBERT: A Pretrained Language Model for Scientific Text (26 Mar 2019)
    Iz Beltagy, Kyle Lo, Arman Cohan. 148 · 2,974 · 0

14. ScispaCy: Fast and Robust Models for Biomedical Natural Language Processing (20 Feb 2019)
    Mark Neumann, Daniel King, Iz Beltagy, Bridger Waleed Ammar. 66 · 684 · 0

15. BioBERT: a pre-trained biomedical language representation model for biomedical text mining (25 Jan 2019)
    Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang. Tags: OOD. 167 · 5,659 · 0

16. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (11 Oct 2018)
    Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova. Tags: VLM, SSL, SSeg. 1.8K · 94,891 · 0

17. Construction of the Literature Graph in Semantic Scholar (06 May 2018)
    Bridger Waleed Ammar, Dirk Groeneveld, Chandra Bhagavatula, Iz Beltagy, Miles Crawford, …, Lucy Lu Wang, Christopher Wilhelm, Zheng Yuan, Madeleine van Zuylen, Oren Etzioni. Tags: GNN. 72 · 396 · 0

18. MS MARCO: A Human Generated MAchine Reading COmprehension Dataset (28 Nov 2016)
    Payal Bajaj, Daniel Fernando Campos, Nick Craswell, Li Deng, Jianfeng Gao, …, Mir Rosenberg, Xia Song, Alina Stoica, Saurabh Tiwary, Tong Wang. Tags: RALM. 142 · 2,728 · 0

19. SQuAD: 100,000+ Questions for Machine Comprehension of Text (16 Jun 2016)
    Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang. Tags: RALM. 289 · 8,160 · 0