arXiv:2208.09770
Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization
21 August 2022
Pengcheng He
Baolin Peng
Liyang Lu
Song Wang
Jie Mei
Yang Liu
Ruochen Xu
Hany Awadalla
Yu Shi
Chenguang Zhu
Wayne Xiong
Michael Zeng
Jianfeng Gao
Xuedong Huang
Papers citing "Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization"
4 / 54 papers shown
Get To The Point: Summarization with Pointer-Generator Networks
A. See
Peter J. Liu
Christopher D. Manning
14 Apr 2017
SQuAD: 100,000+ Questions for Machine Comprehension of Text
Pranav Rajpurkar
Jian Zhang
Konstantin Lopyrev
Percy Liang
16 Jun 2016
Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond
Ramesh Nallapati
Bowen Zhou
Cicero Nogueira dos Santos
Çağlar Gülçehre
Bing Xiang
19 Feb 2016
A Neural Attention Model for Abstractive Sentence Summarization
Alexander M. Rush
S. Chopra
Jason Weston
02 Sep 2015