Is Multilingual BERT Fluent in Language Generation?
arXiv: 1910.03806 · 9 October 2019
Samuel Rönnqvist, Jenna Kanerva, T. Salakoski, Filip Ginter
Links: ArXiv (abs) · PDF · HTML
Papers citing "Is Multilingual BERT Fluent in Language Generation?" (24 papers shown)
Testing the Predictions of Surprisal Theory in 11 Languages
Ethan Gotlieb Wilcox, Tiago Pimentel, Clara Meister, Ryan Cotterell, R. Levy
07 Jul 2023
Understanding BLOOM: An empirical study on diverse NLP tasks
Parag Dakle, Sai Krishna Rallabandi, Preethi Raghavan
27 Nov 2022
Multilingual Transformer Encoders: a Word-Level Task-Agnostic Evaluation
Félix Gaschi, François Plesse, Parisa Rastin, Y. Toussaint
19 Jul 2022
Word-order typology in Multilingual BERT: A case study in subordinate-clause detection
Dmitry Nikolaev, Sebastian Padó
24 May 2022
State-of-the-art in Open-domain Conversational AI: A Survey
Tosin Adewumi, F. Liwicki, Marcus Liwicki
02 May 2022
You Are What You Write: Preserving Privacy in the Era of Large Language Models
Richard Plant, V. Giuffrida, Dimitra Gkatzia
20 Apr 2022
Mono vs Multilingual BERT for Hate Speech Detection and Text Classification: A Case Study in Marathi
Abhishek Velankar, H. Patil, Raviraj Joshi
19 Apr 2022
gaBERT -- an Irish Language Model
James Barry, Joachim Wagner, Lauren Cassidy, Alan Cowap, Teresa Lynn, Abigail Walsh, Mícheál J. Ó Meachair, Jennifer Foster
27 Jul 2021
A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar
01 Jul 2021
Auto-tagging of Short Conversational Sentences using Transformer Methods
D. E. Tasar, Şükrü Ozan, Umut Özdil, M. Akca, Oguzhan Ölmez, Semih Gülüm, Seçilay Kutal, Ceren Belhan
03 Jun 2021
Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models
Laura Pérez-Mayos, Alba Táboas García, Simon Mille, Leo Wanner
10 May 2021
Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation
Zihan Liu, Genta Indra Winata, Pascale Fung
09 May 2021
Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems
E. Razumovskaia, Goran Glavaš, Olga Majewska, Edoardo Ponti, Anna Korhonen, Ivan Vulić
17 Apr 2021
Re-Evaluating GermEval17 Using German Pre-Trained Language Models
Matthias Aßenmacher, A. Corvonato, C. Heumann
24 Feb 2021
How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models
Phillip Rust, Jonas Pfeiffer, Ivan Vulić, Sebastian Ruder, Iryna Gurevych
31 Dec 2020
Parsing with Multilingual BERT, a Small Corpus, and a Small Treebank
Ethan C. Chau, Lucy H. Lin, Noah A. Smith
29 Sep 2020
Playing with Words at the National Library of Sweden -- Making a Swedish BERT
Martin Malmsten, Love Börjeson, Chris Haffenden
03 Jul 2020
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers
Anne Lauscher, Vinit Ravishankar, Ivan Vulić, Goran Glavaš
01 May 2020
Identifying Necessary Elements for BERT's Multilinguality
Philipp Dufter, Hinrich Schütze
01 May 2020
On the Language Neutrality of Pre-trained Multilingual Representations
Jindřich Libovický, Rudolf Rosa, Alexander Fraser
09 Apr 2020
A Primer in BERTology: What we know about how BERT works
Anna Rogers, Olga Kovaleva, Anna Rumshisky
27 Feb 2020
RobBERT: a Dutch RoBERTa-based Language Model
Pieter Delobelle, Thomas Winters, Bettina Berendt
17 Jan 2020
Multilingual is not enough: BERT for Finnish
Antti Virtanen, Jenna Kanerva, Rami Ilo, Jouni Luoma, Juhani Luotolahti, T. Salakoski, Filip Ginter, S. Pyysalo
15 Dec 2019
How Language-Neutral is Multilingual BERT?
Jindřich Libovický, Rudolf Rosa, Alexander Fraser
08 Nov 2019