NukeBERT: A Pre-trained Language Model for Low Resource Nuclear Domain
Ayush Jain, Meenachi Ganesamoorty, B. Venkatraman
arXiv:2003.13821 (v2, latest) · 30 March 2020
Papers citing "NukeBERT: A Pre-trained Language Model for Low Resource Nuclear Domain" (7 of 7 shown)
| Title | Authors | Topics | Likes | Citations | Comments | Published |
|---|---|---|---|---|---|---|
| Efficient, Lexicon-Free OCR using Deep Learning | Marcin Namysl, I. Konya | 3DV | 28 | 35 | 0 | 05 Jun 2019 |
| SciBERT: A Pretrained Language Model for Scientific Text | Iz Beltagy, Kyle Lo, Arman Cohan | – | 168 | 2,986 | 0 | 26 Mar 2019 |
| BioBERT: a pre-trained biomedical language representation model for biomedical text mining | Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang | OOD | 182 | 5,674 | 0 | 25 Jan 2019 |
| BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova | VLM, SSL, SSeg | 1.8K | 95,229 | 0 | 11 Oct 2018 |
| Deep contextualized word representations | Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer | NAI | 233 | 11,565 | 0 | 15 Feb 2018 |
| SQuAD: 100,000+ Questions for Machine Comprehension of Text | Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang | RALM | 316 | 8,174 | 0 | 16 Jun 2016 |
| Efficient Estimation of Word Representations in Vector Space | Tomas Mikolov, Kai Chen, G. Corrado, J. Dean | 3DV | 693 | 31,553 | 0 | 16 Jan 2013 |