ResearchTrend.AI

Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond

19 February 2016
Ramesh Nallapati
Bowen Zhou
Cicero Nogueira dos Santos
Çağlar Gülçehre
Bing Xiang
Abstract

In this work, we cast abstractive text summarization as a sequence-to-sequence problem and apply the framework of attentional encoder-decoder recurrent neural networks, outperforming the state-of-the-art model of Rush et al. (2015) on two different corpora. We also move beyond the basic architecture and propose several novel models to address important problems in summarization, including modeling keywords, capturing the hierarchy of sentence-to-word structure, and handling words that are key to a document but rare elsewhere. Our work shows that many of the proposed solutions contribute to further improvements in performance. In addition, we propose a new dataset consisting of multi-sentence summaries, and establish performance benchmarks for further research.
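The attentional encoder-decoder at the core of this framework scores each encoder hidden state against the current decoder state and emits a weighted "context" vector per decoding step. Below is a minimal NumPy sketch of that attention step using simple dot-product scoring; note this is an illustrative assumption — the paper's models use a learned alignment function and further extensions (hierarchical attention, a pointer/switching mechanism for rare words) not shown here.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(encoder_states, decoder_state):
    """Dot-product attention (illustrative stand-in for a learned
    alignment): score each of the T encoder states against the
    current decoder state, normalize to a distribution, and return
    the attention weights plus the weighted-sum context vector."""
    scores = encoder_states @ decoder_state   # shape (T,)
    weights = softmax(scores)                 # attention distribution over source positions
    context = weights @ encoder_states        # shape (d,), fed to the decoder's output layer
    return weights, context

# Toy example: 4 source positions, hidden size 3, random states.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))   # encoder hidden states
s = rng.normal(size=3)        # current decoder hidden state
w, c = attention_context(H, s)
```

At each decoding step the context vector `c` is concatenated with the decoder state before predicting the next summary word, so the model can attend to different source positions as the summary unfolds.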
