CamemBERT: a Tasty French Language Model
arXiv:1911.03894, 10 November 2019
Louis Martin, Benjamin Muller, Pedro Ortiz Suarez, Yoann Dupont, Laurent Romary, Éric Villemonte de la Clergerie, Djamé Seddah, Benoît Sagot
Papers citing "CamemBERT: a Tasty French Language Model" (50 of 361 shown)
Are the Multilingual Models Better? Improving Czech Sentiment with Transformers. Pavel Přibáň, J. Steinberger. 24 Aug 2021.
UzBERT: pretraining a BERT model for Uzbek. B. Mansurov, A. Mansurov. 22 Aug 2021. [VLM]
AMMUS : A Survey of Transformer-based Pretrained Models in Natural Language Processing. Katikapalli Subramanyam Kalyan, A. Rajasekharan, S. Sangeetha. 12 Aug 2021. [VLM, LM&MA]
PyEuroVoc: A Tool for Multilingual Legal Document Classification with EuroVoc Descriptors. Andrei-Marius Avram, V. Pais, D. Tufis. 02 Aug 2021. [AILaw, VLM]
Context-aware Adversarial Training for Name Regularity Bias in Named Entity Recognition. Abbas Ghaddar, Philippe Langlais, Ahmad Rashid, Mehdi Rezagholizadeh. 24 Jul 2021.
Evaluation of contextual embeddings on less-resourced languages. Matej Ulčar, Aleš Žagar, C. S. Armendariz, Andraž Repar, Senja Pollak, Matthew Purver, Marko Robnik-Šikonja. 22 Jul 2021.
Comparison of Czech Transformers on Text Classification Tasks. Jan Lehečka, Jan Švec. 21 Jul 2021. [VLM]
Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? A Comprehensive Assessment for Catalan. Jordi Armengol-Estapé, C. Carrino, Carlos Rodríguez-Penagos, Ona de Gibert Bonet, Carme Armentano-Oller, Aitor Gonzalez-Agirre, Maite Melero, Marta Villegas. 16 Jul 2021.
MarIA: Spanish Language Models. Asier Gutiérrez-Fandiño, Jordi Armengol-Estapé, Marc Pàmies, Joan Llop-Palao, Joaquín Silveira-Ocampo, C. Carrino, Aitor Gonzalez-Agirre, Carme Armentano-Oller, Carlos Rodríguez-Penagos, Marta Villegas. 15 Jul 2021. [VLM]
A Review of Bangla Natural Language Processing Tasks and the Utility of Transformer Models. Firoj Alam, Md. Arid Hasan, Tanvirul Alam, A. Khan, Jannatul Tajrin, Naira Khan, Shammur A. Chowdhury. 08 Jul 2021. [LM&MA]
COMBO: a new module for EUD parsing. Mateusz Klimaszewski, Alina Wróblewska. 08 Jul 2021. [MoE]
Enhanced Universal Dependency Parsing with Automated Concatenation of Embeddings. Xinyu Wang, Zixia Jia, Yong-jia Jiang, Kewei Tu. 06 Jul 2021.
A Primer on Pretrained Multilingual Language Models. Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar. 01 Jul 2021. [LRM]
GlyphCRM: Bidirectional Encoder Representation for Chinese Character with its Glyph. Yunxin Li, Yu Zhao, Baotian Hu, Qingcai Chen, Yang Xiang, Xiaolong Wang, Yuxin Ding, Lin Ma. 01 Jul 2021.
Where are we in semantic concept extraction for Spoken Language Understanding? Sahar Ghannay, Antoine Caubrière, Salima Mdhaffar, G. Laperriere, Bassam Jabaian, Yannick Estève. 24 Jun 2021.
Evaluating Various Tokenizers for Arabic Text Classification. Zaid Alyafeai, Maged S. Al-Shaibani, Mustafa Ghaleb, Irfan Ahmad. 14 Jun 2021.
MergeDistill: Merging Pre-trained Language Models using Distillation. Simran Khanuja, Melvin Johnson, Partha P. Talukdar. 05 Jun 2021.
Evaluating the Efficacy of Summarization Evaluation across Languages. Fajri Koto, Jey Han Lau, Timothy Baldwin. 02 Jun 2021.
belabBERT: a Dutch RoBERTa-based language model applied to psychiatric classification. J. Wouts, J. D. Boer, A. Voppel, S. Brederoo, S. V. Splunter, I. Sommer. 02 Jun 2021.
RobeCzech: Czech RoBERTa, a monolingual contextualized language representation model. Milan Straka, Jakub Náplava, Jana Straková, David Samuel. 24 May 2021.
De-identification of Privacy-related Entities in Job Postings. Kristian Nørgaard Jensen, Mike Zhang, Barbara Plank. 24 May 2021.
A systematic review of Hate Speech automatic detection using Natural Language Processing. Md Saroar Jahan, Mourad Oussalah. 22 May 2021.
Evaluation Of Word Embeddings From Large-Scale French Web Content. Hadi Abdine, Christos Xypolopoulos, Moussa Kamal Eddine, Michalis Vazirgiannis. 05 May 2021.
HerBERT: Efficiently Pretrained Transformer-based Language Model for Polish. Robert Mroczkowski, Piotr Rybak, Alina Wróblewska, Ireneusz Gawlik. 04 May 2021.
MAGMA: An Optimization Framework for Mapping Multiple DNNs on Multiple Accelerator Cores. Sheng-Chun Kao, T. Krishna. 28 Apr 2021.
BERTić -- The Transformer Language Model for Bosnian, Croatian, Montenegrin and Serbian. N. Ljubešić, D. Lauc. 19 Apr 2021.
AMMU : A Survey of Transformer-based Biomedical Pretrained Language Models. Katikapalli Subramanyam Kalyan, A. Rajasekharan, S. Sangeetha. 16 Apr 2021. [LM&MA, MedIm]
How to Train BERT with an Academic Budget. Peter Izsak, Moshe Berchansky, Omer Levy. 15 Apr 2021.
DATE: Detecting Anomalies in Text via Self-Supervision of Transformers. Andrei Manolache, Florin Brad, Elena Burceanu. 12 Apr 2021. [UQCV]
FreSaDa: A French Satire Data Set for Cross-Domain Satire Detection. Radu Tudor Ionescu, Adrian-Gabriel Chifu. 10 Apr 2021.
Detecting of a Patient's Condition From Clinical Narratives Using Natural Language Representation. Thanh-Dung Le, R. Noumeir, J. Rambaud, Guillaume Sans, P. Jouvet. 08 Apr 2021.
Distantly Supervised Transformers For E-Commerce Product QA. Happy Mittal, Aniket Chakrabarti, Belhassen Bayar, Animesh Sharma, Nikhil Rasiwasia. 07 Apr 2021. [RALM]
Effect of depth order on iterative nested named entity recognition models. Perceval Wajsburt, Yoann Taillé, Xavier Tannier. 02 Apr 2021.
Czert -- Czech BERT-like Model for Language Representation. Jakub Sido, O. Pražák, P. Přibáň, Jan Pasek, Michal Seják, Miloslav Konopík. 24 Mar 2021.
Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets. Julia Kreutzer, Isaac Caswell, Lisa Wang, Ahsan Wahab, D. Esch, ..., Duygu Ataman, Orevaoghene Ahia, Oghenefego Ahia, Sweta Agrawal, Mofetoluwa Adeyemi. 22 Mar 2021.
RUBERT: A Bilingual Roman Urdu BERT Using Cross Lingual Transfer Learning. Usama Khalid, M. O. Beg, Muhammad Umair Arshad. 22 Feb 2021.
Bilingual Language Modeling, A transfer learning technique for Roman Urdu. Usama Khalid, M. O. Beg, Muhammad Umair Arshad. 22 Feb 2021.
Pre-Training BERT on Arabic Tweets: Practical Considerations. Ahmed Abdelali, Sabit Hassan, Hamdy Mubarak, Kareem Darwish, Younes Samih. 21 Feb 2021.
Fine-tuning BERT-based models for Plant Health Bulletin Classification. Shufan Jiang, Rafael Angarita, Stéphane Cormier, Francis Rousseaux. 29 Jan 2021.
KoreALBERT: Pretraining a Lite BERT Model for Korean Language Understanding. HyunJae Lee, Jaewoong Yoon, Bonggyu Hwang, Seongho Joe, Seungjai Min, Youngjune Gwon. 27 Jan 2021. [SSeg]
WangchanBERTa: Pretraining transformer-based Thai Language Models. Lalita Lowphansirikul, Charin Polpanumas, Nawat Jantrakulchai, Sarana Nutanong. 24 Jan 2021.
EfficientQA : a RoBERTa Based Phrase-Indexed Question-Answering System. Sofian Chaybouti, Achraf Saghe, A. Shabou. 06 Jan 2021. [RALM]
Superbizarre Is Not Superb: Derivational Morphology Improves BERT's Interpretation of Complex Words. Valentin Hofmann, J. Pierrehumbert, Hinrich Schütze. 02 Jan 2021.
BanglaBERT: Language Model Pretraining and Benchmarks for Low-Resource Language Understanding Evaluation in Bangla. Abhik Bhattacharjee, Tahmid Hasan, Wasi Uddin Ahmad, Kazi Samin Mubasshir, Md. Saiful Islam, Anindya Iqbal, M. Rahman, Rifat Shahriyar. 01 Jan 2021. [SSL, VLM]
How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models. Phillip Rust, Jonas Pfeiffer, Ivan Vulić, Sebastian Ruder, Iryna Gurevych. 31 Dec 2020.
ARBERT & MARBERT: Deep Bidirectional Transformers for Arabic. Muhammad Abdul-Mageed, AbdelRahim Elmadany, El Moatez Billah Nagoudi. 27 Dec 2020. [VLM]
Leveraging ParsBERT and Pretrained mT5 for Persian Abstractive Text Summarization. Mehrdad Farahani, Mohammad Gharachorloo, Mohammad Manthouri. 21 Dec 2020.
Towards Grad-CAM Based Explainability in a Legal Text Processing Pipeline. Lukasz Górski, Shashishekar Ramakrishna, J. Nowosielski. 15 Dec 2020. [AILaw]
Fine-tuning BERT for Low-Resource Natural Language Understanding via Active Learning. Daniel Grießhaber, J. Maucher, Ngoc Thang Vu. 04 Dec 2020.
GottBERT: a pure German Language Model. Raphael Scheible, Fabian Thomczyk, P. Tippmann, V. Jaravine, M. Boeker. 03 Dec 2020. [VLM]