Best Practices for Data-Efficient Modeling in NLG: How to Train Production-Ready Neural Models with Less Data

8 November 2020
A. Arun
Soumya Batra
Vikas Bhardwaj
Ashwini Challa
Pinar E. Donmez
Peyman Heidari
Hakan Inan
Shashank Jain
Anuj Kumar
Shawn Mei
Karthika Mohan
Michael White

Papers citing "Best Practices for Data-Efficient Modeling in NLG: How to Train Production-Ready Neural Models with Less Data"

18 / 18 papers shown
Few-shot Natural Language Generation for Task-Oriented Dialog
Baolin Peng
Chenguang Zhu
Chunyuan Li
Xiujun Li
Jinchao Li
Michael Zeng
Jianfeng Gao
35
198
0
27 Feb 2020
Multilingual Denoising Pre-training for Neural Machine Translation
Yinhan Liu
Jiatao Gu
Naman Goyal
Xian Li
Sergey Edunov
Marjan Ghazvininejad
M. Lewis
Luke Zettlemoyer
AI4CE
AIMat
92
1,786
0
22 Jan 2020
A Good Sample is Hard to Find: Noise Injection Sampling and Self-Training for Neural Language Generation Models
Chris Kedzie
Kathleen McKeown
26
36
0
08 Nov 2019
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
M. Lewis
Yinhan Liu
Naman Goyal
Marjan Ghazvininejad
Abdel-rahman Mohamed
Omer Levy
Veselin Stoyanov
Luke Zettlemoyer
AIMat
VLM
81
10,720
0
29 Oct 2019
Revisiting Self-Training for Neural Sequence Generation
Junxian He
Jiatao Gu
Jiajun Shen
Marc'Aurelio Ranzato
SSL
LRM
256
272
0
30 Sep 2019
Constrained Decoding for Neural NLG from Compositional Representations in Task-Oriented Dialogue
Anusha Balakrishnan
J. Rao
Kartikeya Upasani
Michael White
R. Subba
110
82
0
17 Jun 2019
Few-Shot NLG with Pre-Trained Language Model
Zhiyu Zoey Chen
H. Eavani
Wenhu Chen
Yinyin Liu
William Yang Wang
LMTD
35
141
0
21 Apr 2019
MultiWOZ -- A Large-Scale Multi-Domain Wizard-of-Oz Dataset for Task-Oriented Dialogue Modelling
Paweł Budzianowski
Tsung-Hsien Wen
Bo-Hsiang Tseng
I. Casanueva
Stefan Ultes
Osman Ramadan
Milica Gasic
108
1,306
0
29 Sep 2018
The E2E Dataset: New Challenges For End-to-End Generation
Jekaterina Novikova
Ondrej Dusek
Verena Rieser
64
456
0
28 Jun 2017
Survey of the State of the Art in Natural Language Generation: Core tasks, applications and evaluation
Albert Gatt
E. Krahmer
LM&MA
ELM
57
814
0
29 Mar 2017
Sequence-Level Knowledge Distillation
Yoon Kim
Alexander M. Rush
63
1,109
0
25 Jun 2016
Sequence-to-Sequence Generation for Spoken Dialogue via Deep Syntax Trees and Strings
Ondrej Dusek
Filip Jurčíček
33
187
0
17 Jun 2016
Multi-domain Neural Network Language Generation for Spoken Dialogue Systems
Tsung-Hsien Wen
Milica Gasic
N. Mrksic
L. Rojas-Barahona
Pei-hao Su
David Vandyke
S. Young
28
188
0
03 Mar 2016
Semantically Conditioned LSTM-based Natural Language Generation for Spoken Dialogue Systems
Tsung-Hsien Wen
Milica Gasic
N. Mrksic
Pei-hao Su
David Vandyke
S. Young
61
948
0
07 Aug 2015
Adam: A Method for Stochastic Optimization
Diederik P. Kingma
Jimmy Ba
ODL
316
149,474
0
22 Dec 2014
Sequence to Sequence Learning with Neural Networks
Ilya Sutskever
Oriol Vinyals
Quoc V. Le
AIMat
220
20,467
0
10 Sep 2014
Neural Machine Translation by Jointly Learning to Align and Translate
Dzmitry Bahdanau
Kyunghyun Cho
Yoshua Bengio
AIMat
287
27,205
0
01 Sep 2014
Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
Kyunghyun Cho
B. V. Merrienboer
Çağlar Gülçehre
Dzmitry Bahdanau
Fethi Bougares
Holger Schwenk
Yoshua Bengio
AIMat
420
23,235
0
03 Jun 2014