arXiv: 2109.10234
BERTweetFR : Domain Adaptation of Pre-Trained Language Models for French Tweets
21 September 2021
Yanzhu Guo, Virgile Rennard, Christos Xypolopoulos, Michalis Vazirgiannis
Papers citing "BERTweetFR : Domain Adaptation of Pre-Trained Language Models for French Tweets" (8 papers shown)
BERTweet: A pre-trained language model for English Tweets. Dat Quoc Nguyen, Thanh Tien Vu, A. Nguyen. 20 May 2020.
Don't Stop Pretraining: Adapt Language Models to Domains and Tasks. Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith. 23 Apr 2020.
FlauBERT: Unsupervised Language Model Pre-training for French. Hang Le, Loïc Vial, Jibril Frej, Vincent Segonne, Maximin Coavoux, Benjamin Lecouteux, A. Allauzen, Benoît Crabbé, Laurent Besacier, D. Schwab. 11 Dec 2019.
CamemBERT: a Tasty French Language Model. Louis Martin, Benjamin Muller, Pedro Ortiz Suarez, Yoann Dupont, Laurent Romary, Eric Villemonte de la Clergerie, Djamé Seddah, Benoît Sagot. 10 Nov 2019.
SentencePiece: A simple and language independent subword tokenizer and detokenizer for Neural Text Processing. Taku Kudo, John Richardson. 19 Aug 2018.
Subword Regularization: Improving Neural Network Translation Models with Multiple Subword Candidates. Taku Kudo. 29 Apr 2018.
Unsupervised Pretraining for Sequence to Sequence Learning. Prajit Ramachandran, Peter J. Liu, Quoc V. Le. 08 Nov 2016.
Distributed Representations of Words and Phrases and their Compositionality. Tomas Mikolov, Ilya Sutskever, Kai Chen, G. Corrado, J. Dean. 16 Oct 2013.