BERTje: A Dutch BERT Model (arXiv:1912.09582)
19 December 2019
Wietse de Vries, Andreas van Cranenburgh, Arianna Bisazza, Tommaso Caselli, Gertjan van Noord, Malvina Nissim

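For readers who want to experiment with the model that the papers below cite, here is a minimal sketch of loading BERTje with the Hugging Face Transformers library. The hub ID "GroNLP/bert-base-dutch-cased" is assumed to be the published checkpoint name; verify it against the authors' official repository before relying on it.

```python
# Minimal sketch: load BERTje and encode a Dutch sentence.
# Assumes the checkpoint is published as "GroNLP/bert-base-dutch-cased";
# check the authors' repository if the ID has changed.
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "GroNLP/bert-base-dutch-cased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# Encode one sentence and inspect the contextual embeddings.
inputs = tokenizer("Groningen is een stad in het noorden van Nederland.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, seq_len, 768])
```
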
Papers citing "BERTje: A Dutch BERT Model" (28 of 128 shown)

When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
Benjamin Muller, Antonis Anastasopoulos, Benoît Sagot, Djamé Seddah
24 Oct 2020

Constructing Taxonomies from Pretrained Language Models
Catherine Chen, Kevin Lin, Dan Klein
24 Oct 2020

BARThez: a Skilled Pretrained French Sequence-to-Sequence Model
Moussa Kamal Eddine, A. Tixier, Michalis Vazirgiannis
23 Oct 2020

mT5: A massively multilingual pre-trained text-to-text transformer
Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel
22 Oct 2020

Towards Fully Bilingual Deep Language Modeling
Li-Hsin Chang, S. Pyysalo, Jenna Kanerva, Filip Ginter
22 Oct 2020

X-FACTR: Multilingual Factual Knowledge Retrieval from Pretrained Language Models
Zhengbao Jiang, Antonios Anastasopoulos, Jun Araki, Haibo Ding, Graham Neubig
13 Oct 2020

Gender prediction using limited Twitter Data
M. Burghoorn, M. D. Boer, S. Raaijmakers
29 Sep 2020

Neural Proof Nets
Konstantinos Kogkalidis, M. Moortgat, R. Moot
26 Sep 2020

AnchiBERT: A Pre-Trained Model for Ancient Chinese Language Understanding and Generation
Huishuang Tian, Kexin Yang, Dayiheng Liu, Jiancheng Lv
24 Sep 2020

The birth of Romanian BERT
Stefan Daniel Dumitrescu, Andrei-Marius Avram, S. Pyysalo
18 Sep 2020

The ADAPT Enhanced Dependency Parser at the IWPT 2020 Shared Task
James Barry, Joachim Wagner, Jennifer Foster
03 Sep 2020

PTT5: Pretraining and validating the T5 model on Brazilian Portuguese data
Diedre Carmo, Marcos Piau, Israel Campiotti, Rodrigo Nogueira, R. Lotufo
20 Aug 2020

The Unreasonable Effectiveness of Machine Learning in Moldavian versus Romanian Dialect Identification
Mihaela Guaman, Radu Tudor Ionescu
30 Jul 2020

Text-based classification of interviews for mental health -- juxtaposing the state of the art
J. Wouts
29 Jul 2020

Transferring Monolingual Model to Low-Resource Language: The Case of Tigrinya
Abrhalei Tela, Abraham Woubie, Ville Hautamaki
13 Jun 2020

Pre-training Polish Transformer-based Language Models at Scale
Slawomir Dadas, Michal Perelkiewicz, Rafal Poswiata
07 Jun 2020

Exploring Cross-sentence Contexts for Named Entity Recognition with BERT
Jouni Luoma, S. Pyysalo
02 Jun 2020

WikiBERT models: deep transfer learning for many languages
S. Pyysalo, Jenna Kanerva, Antti Virtanen, Filip Ginter
02 Jun 2020

ParsBERT: Transformer-based Model for Persian Language Understanding
Mehrdad Farahani, Mohammad Gharachorloo, Marzieh Farahani, Mohammad Manthouri
26 May 2020

Fighting the COVID-19 Infodemic: Modeling the Perspective of Journalists, Fact-Checkers, Social Media Platforms, Policy Makers, and the Society
Firoj Alam, Shaden Shaar, Fahim Dalvi, Hassan Sajjad, Alex Nikolov, ..., Tommaso Caselli, Gijs Danoe, Friso Stolk, Britt Bruntink, Preslav Nakov
30 Apr 2020

What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models
Wietse de Vries, Andreas van Cranenburgh, Malvina Nissim
14 Apr 2020

Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
18 Mar 2020

What the [MASK]? Making Sense of Language-Specific BERT Models
Debora Nozza, Federico Bianchi, Dirk Hovy
05 Mar 2020

PhoBERT: Pre-trained language models for Vietnamese
Dat Quoc Nguyen, A. Nguyen
02 Mar 2020

AraBERT: Transformer-based Model for Arabic Language Understanding
Wissam Antoun, Fady Baly, Hazem M. Hajj
28 Feb 2020

FQuAD: French Question Answering Dataset
Martin d'Hoffschmidt, Wacim Belblidia, Tom Brendlé, Quentin Heinrich, Maxime Vidal
14 Feb 2020

RobBERT: a Dutch RoBERTa-based Language Model
Pieter Delobelle, Thomas Winters, Bettina Berendt
17 Jan 2020

FlauBERT: Unsupervised Language Model Pre-training for French
Hang Le, Loïc Vial, Jibril Frej, Vincent Segonne, Maximin Coavoux, Benjamin Lecouteux, A. Allauzen, Benoît Crabbé, Laurent Besacier, D. Schwab
11 Dec 2019