ResearchTrend.AI

Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All

arXiv:2211.15199, 28 November 2022

Eylon Guetta, Avi Shmidman, Shaltiel Shmidman, C. Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker, Reut Tsarfaty

Papers citing "Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All"

3 papers
Explicit Morphological Knowledge Improves Pre-training of Language Models for Hebrew
Eylon Gueta, Omer Goldman, Reut Tsarfaty
01 Nov 2023
DictaBERT: A State-of-the-Art BERT Suite for Modern Hebrew
Shaltiel Shmidman, Avi Shmidman, Moshe Koppel
31 Aug 2023
Multilingual Sequence-to-Sequence Models for Hebrew NLP
Matan Eyal, Hila Noga, Roee Aharoni, Idan Szpektor, Reut Tsarfaty
19 Dec 2022