arXiv:2309.11259
Sequence-to-Sequence Spanish Pre-trained Language Models
20 September 2023
Vladimir Araujo, Maria Mihaela Truşcǎ, Rodrigo Tufino, Marie-Francine Moens

Papers citing "Sequence-to-Sequence Spanish Pre-trained Language Models" (8 of 8 papers shown)

FairytaleQA Translated: Enabling Educational Question and Answer Generation in Less-Resourced Languages
Bernardo Leite, T. Osório, Henrique Lopes Cardoso
AI4Ed · 27 · 1 · 0 · 06 Jun 2024

RigoBERTa: A State-of-the-Art Language Model For Spanish
Alejandro Vaca Serrano, Guillem García Subies, Helena Montoro Zamorano, Nuria Aldama García, Doaa Samy, David Betancur Sánchez, Antonio Moreno-Sandoval, Marta Guerrero Nieto, Á. Jiménez
AILaw · 13 · 13 · 0 · 27 Apr 2022

CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation
Yunfan Shao, Zhichao Geng, Yitao Liu, Junqi Dai, Hang Yan, Fei Yang, Li Zhe, Hujun Bao, Xipeng Qiu
MedIm · 70 · 147 · 0 · 13 Sep 2021

BiSECT: Learning to Split and Rephrase Sentences with Bitexts
Joongwon Kim, Mounica Maddela, Reno Kriz, Wei-ping Xu, Chris Callison-Burch
56 · 25 · 0 · 10 Sep 2021

AraT5: Text-to-Text Transformers for Arabic Language Generation
El Moatez Billah Nagoudi, AbdelRahim Elmadany, Muhammad Abdul-Mageed
86 · 118 · 0 · 31 Aug 2021

Deduplicating Training Data Makes Language Models Better
Katherine Lee, Daphne Ippolito, A. Nystrom, Chiyuan Zhang, Douglas Eck, Chris Callison-Burch, Nicholas Carlini
SyDa · 242 · 593 · 0 · 14 Jul 2021

BARThez: a Skilled Pretrained French Sequence-to-Sequence Model
Moussa Kamal Eddine, A. Tixier, Michalis Vazirgiannis
BDL · 103 · 64 · 0 · 23 Oct 2020

MLQA: Evaluating Cross-lingual Extractive Question Answering
Patrick Lewis, Barlas Oğuz, Ruty Rinott, Sebastian Riedel, Holger Schwenk
ELM · 246 · 492 · 0 · 16 Oct 2019