ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

VieSum: How Robust Are Transformer-based Models on Vietnamese Summarization?

8 October 2021
Hieu Duy Nguyen, Long Phan, J. Anibal, Alec Peltekian, H. Tran
arXiv: 2110.04257

Papers citing "VieSum: How Robust Are Transformer-based Models on Vietnamese Summarization?"

4 papers shown
1. ViT5: Pretrained Text-to-Text Transformer for Vietnamese Language Generation
   Long Phan, H. Tran, Hieu Duy Nguyen, Trieu H. Trinh (13 May 2022)
2. BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese
   Nguyen Luong Tran, Duong Minh Le, Dat Quoc Nguyen (20 Sep 2021)
3. PhoBERT: Pre-trained language models for Vietnamese
   Dat Quoc Nguyen, A. Nguyen (02 Mar 2020)
4. Text Summarization with Pretrained Encoders
   Yang Liu, Mirella Lapata (22 Aug 2019)