ResearchTrend.AI

arXiv:2009.09417 · Cited By
F^2-Softmax: Diversifying Neural Text Generation via Frequency Factorized Softmax

20 September 2020
Byung-Ju Choi, Jimin Hong, D. Park, Sang Wan Lee

Papers citing "F^2-Softmax: Diversifying Neural Text Generation via Frequency Factorized Softmax" (6 of 6 shown)

1. Partially Randomizing Transformer Weights for Dialogue Response Diversity
   Jing Yang Lee, Kong Aik Lee, Woon-Seng Gan (18 Nov 2023)

2. Learning to Diversify Neural Text Generation via Degenerative Model
   Jimin Hong, ChaeHun Park, Jaegul Choo (22 Sep 2023)

3. Why Exposure Bias Matters: An Imitation Learning Perspective of Error Accumulation in Language Generation
   Kushal Arora, Layla El Asri, Hareesh Bahuleyan, Jackie C.K. Cheung (03 Apr 2022)

4. Rethinking and Refining the Distinct Metric
   Siyang Liu, Sahand Sabour, Yinhe Zheng, Pei Ke, Xiaoyan Zhu, Minlie Huang (28 Feb 2022)

5. Diversifying Dialog Generation via Adaptive Label Smoothing
   Yida Wang, Yinhe Zheng, Yong-jia Jiang, Minlie Huang (30 May 2021)

6. Focus Attention: Promoting Faithfulness and Diversity in Summarization
   Rahul Aralikatte, Shashi Narayan, Joshua Maynez, S. Rothe, Ryan T. McDonald (25 May 2021)