arXiv:2305.03788
Cited By
Harnessing the Power of BERT in the Turkish Clinical Domain: Pretraining Approaches for Limited Data Scenarios
Hazal Türkmen, Oğuz Dikenelli, C. Eraslan, Mehmet Cem Çalli, S. Özbek
5 May 2023
Papers citing "Harnessing the Power of BERT in the Turkish Clinical Domain: Pretraining Approaches for Limited Data Scenarios" (7 papers)

Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing
Yu Gu, Robert Tinn, Hao Cheng, Michael R. Lucas, Naoto Usuyama, Xiaodong Liu, Tristan Naumann, Jianfeng Gao, Hoifung Poon
31 Jul 2020

Don't Stop Pretraining: Adapt Language Models to Domains and Tasks
Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith
23 Apr 2020

Publicly Available Clinical BERT Embeddings
Emily Alsentzer, John R. Murphy, Willie Boag, W. Weng, Di Jin, Tristan Naumann, Matthew B. A. McDermott
6 Apr 2019

SciBERT: A Pretrained Language Model for Scientific Text
Iz Beltagy, Kyle Lo, Arman Cohan
26 Mar 2019

BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang
25 Jan 2019

Deep contextualized word representations
Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer
15 Feb 2018

Distributed Representations of Words and Phrases and their Compositionality
Tomas Mikolov, Ilya Sutskever, Kai Chen, G. Corrado, J. Dean
16 Oct 2013