
Fine-tuning GPT-3 for Russian Text Summarization
Alexandr Nikolich, Arina Puchkova
7 August 2021 · arXiv:2108.03502

Papers citing "Fine-tuning GPT-3 for Russian Text Summarization"

12 papers shown

  1. ProdRev: A DNN framework for empowering customers using generative pre-trained transformers
     Aakash Gupta, Nataraj Das (14 May 2025)

  2. Automatic Speech Summarisation: A Scoping Review
     Dana Rezazadegan, S. Berkovsky, J. Quiroz, A. Kocaballi, Ying Wang, L. Laranjo, E. Coiera (27 Aug 2020)

  3. Advances of Transformer-Based Models for News Headline Generation
     Alexey Bukhtiyarov, I. Gusev (09 Jul 2020)

  4. Dataset for Automatic Summarization of Russian News
     I. Gusev (19 Jun 2020)

  5. Automatic Text Summarization of COVID-19 Medical Research Articles using BERT and GPT-2
     V. Kieuvongngam, Bowen Tan, Yiming Niu (03 Jun 2020)

  6. PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization
     Jingqing Zhang, Yao-Min Zhao, Mohammad Saleh, Peter J. Liu (18 Dec 2019)

  7. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
     M. Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdel-rahman Mohamed, Omer Levy, Veselin Stoyanov, Luke Zettlemoyer (29 Oct 2019)

  8. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
     Colin Raffel, Noam M. Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu (23 Oct 2019)

  9. BERTScore: Evaluating Text Generation with BERT
     Tianyi Zhang, Varsha Kishore, Felix Wu, Kilian Q. Weinberger, Yoav Artzi (21 Apr 2019)

  10. Self-Attentive Model for Headline Generation
      Daniil Gavrilov, Pavel Kalaidin, Valentin Malykh (23 Jan 2019)

  11. Attention Is All You Need
      Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin (12 Jun 2017)

  12. A Neural Attention Model for Abstractive Sentence Summarization
      Alexander M. Rush, S. Chopra, Jason Weston (02 Sep 2015)