Can Character-based Language Models Improve Downstream Task Performance in Low-Resource and Noisy Language Scenarios?
Arij Riabi, Benoît Sagot, Djamé Seddah
arXiv 2110.13658 · 26 October 2021
Papers citing "Can Character-based Language Models Improve Downstream Task Performance in Low-Resource and Noisy Language Scenarios?" (7 papers)
Does Manipulating Tokenization Aid Cross-Lingual Transfer? A Study on POS Tagging for Non-Standardized Languages
Verena Blaschke, Hinrich Schütze, Barbara Plank
20 Apr 2023

Multilingual Auxiliary Tasks Training: Bridging the Gap between Languages for Zero-Shot Transfer of Hate Speech Detection Models
Syrielle Montariol, Arij Riabi, Djamé Seddah
24 Oct 2022

Language Modelling with Pixels
Phillip Rust, Jonas F. Lotz, Emanuele Bugliarello, Elizabeth Salesky, Miryam de Lhoneux, Desmond Elliott
14 Jul 2022

Text normalization for low-resource languages: the case of Ligurian
S. Lusito, Edoardo Ferrante, Jean Maillard
16 Jun 2022

DziriBERT: a Pre-trained Language Model for the Algerian Dialect
Amine Abdaoui, Mohamed Berrimi, Mourad Oussalah, A. Moussaoui
25 Sep 2021

When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
Benjamin Muller, Antonis Anastasopoulos, Benoît Sagot, Djamé Seddah
24 Oct 2020

CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters
Hicham El Boukkouri, Olivier Ferret, Thomas Lavergne, Hiroshi Noji, Pierre Zweigenbaum, Junichi Tsujii
20 Oct 2020