
Unsupervised Dual-Cascade Learning with Pseudo-Feedback Distillation for Query-based Extractive Summarization (arXiv:1811.00436)

1 November 2018
Haggai Roitman
Guy Feigenblat
D. Konopnicki
D. Cohen
O. Boni

Papers citing "Unsupervised Dual-Cascade Learning with Pseudo-Feedback Distillation for Query-based Extractive Summarization"

Domain Adaptation with Pre-trained Transformers for Query Focused Abstractive Text Summarization
Md Tahmid Rahman Laskar
Enamul Hoque
J. Huang
22 Dec 2021