Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All
arXiv: 2211.15199
28 November 2022
Eylon Guetta, Avi Shmidman, Shaltiel Shmidman, C. Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker, Reut Tsarfaty

Papers citing "Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All" (3 papers)

Explicit Morphological Knowledge Improves Pre-training of Language Models for Hebrew
Eylon Gueta, Omer Goldman, Reut Tsarfaty
01 Nov 2023

DictaBERT: A State-of-the-Art BERT Suite for Modern Hebrew
Shaltiel Shmidman, Avi Shmidman, Moshe Koppel
31 Aug 2023

Multilingual Sequence-to-Sequence Models for Hebrew NLP
Matan Eyal, Hila Noga, Roee Aharoni, Idan Szpektor, Reut Tsarfaty
19 Dec 2022