ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation

28 August 2021
Beomsu Kim, Seokjun Seo, Seungju Han, Enkhbayar Erdenee, Buru Chang
RALM

Papers citing "Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation"

3 / 3 papers shown
Measuring and Improving Semantic Diversity of Dialogue Generation
Seungju Han, Beomsu Kim, Buru Chang
11 Oct 2022
Understanding and Improving the Exemplar-based Generation for Open-domain Conversation
Seungju Han, Beomsu Kim, Seokjun Seo, Enkhbayar Erdenee, Buru Chang
13 Dec 2021
Data Augmentation using Pre-trained Transformer Models
Varun Kumar, Ashutosh Choudhary, Eunah Cho
VLM
04 Mar 2020