A Family of Pretrained Transformer Language Models for Russian
arXiv: 2309.10931 · 19 September 2023
Dmitry Zmitrovich, Alexander Abramov, Andrey Kalmykov, Maria Tikhonova, Ekaterina Taktasheva, Danil Astafurov, Mark Baushenko, Artem Snegirev, Vitalii Kadulin, Sergey Markov, Tatiana Shavrina, Vladislav Mikhailov, Alena Fenogenova
Papers citing "A Family of Pretrained Transformer Language Models for Russian" (6 of 6 papers shown)
Methods for Recognizing Nested Terms
I. Rozhkov, Natalia Loukachevitch
22 Apr 2025

REPA: Russian Error Types Annotation for Evaluating Text Generation and Judgment Capabilities
Alexander Pugachev, Alena Fenogenova, Vladislav Mikhailov, Ekaterina Artemova
17 Mar 2025

Anything Goes? A Crosslinguistic Study of (Im)possible Language Learning in LMs
Xiulin Yang, Tatsuya Aoyama, Yuekun Yao, Ethan Wilcox
26 Feb 2025

The Russian-focused embedders' exploration: ruMTEB benchmark and Russian embedding model design
Artem Snegirev, Maria Tikhonova, Anna Maksimova, Alena Fenogenova, Alexander Abramov
22 Aug 2024

mGPT: Few-Shot Learners Go Multilingual
Oleh Shliazhko, Alena Fenogenova, Maria Tikhonova, Vladislav Mikhailov, Anastasia Kozlova, Tatiana Shavrina
15 Apr 2022

PhoBERT: Pre-trained language models for Vietnamese
Dat Quoc Nguyen, A. Nguyen
02 Mar 2020