arXiv:2304.01282
PEACH: Pre-Training Sequence-to-Sequence Multilingual Models for Translation with Semi-Supervised Pseudo-Parallel Document Generation
Alireza Salemi, Amirhossein Abaskohi, Sara Tavakoli, Yadollah Yaghoobzadeh, A. Shakery
3 April 2023
Papers citing "PEACH: Pre-Training Sequence-to-Sequence Multilingual Models for Translation with Semi-Supervised Pseudo-Parallel Document Generation" (4 papers):
ARMAN: Pre-training with Semantically Selecting and Reordering of Sentences for Persian Abstractive Summarization
Alireza Salemi, Emad Kebriaei, Ghazal Neisi Minaei, A. Shakery
09 Sep 2021
mT6: Multilingual Pretrained Text-to-Text Transformer with Translation Pairs
Zewen Chi, Li Dong, Shuming Ma, Shaohan Huang, Xian-Ling Mao, Heyan Huang, Furu Wei
18 Apr 2021
Word Translation Without Parallel Data
Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou
11 Oct 2017
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
26 Sep 2016