Learning neural trans-dimensional random field language models with noise-contrastive estimation

30 October 2017
Bin Wang, Zhijian Ou
arXiv: 1710.10739

Papers citing "Learning neural trans-dimensional random field language models with noise-contrastive estimation"

4 of 4 citing papers shown
Exploring Energy-based Language Models with Different Architectures and Training Methods for Speech Recognition
Hong Liu, Z. Lv, Zhijian Ou, Wenbo Zhao, Qing Xiao
22 May 2023

RankGen: Improving Text Generation with Large Ranking Models
Kalpesh Krishna, Yapei Chang, John Wieting, Mohit Iyyer
19 May 2022

Residual Energy-Based Models for Text Generation
Yuntian Deng, A. Bakhtin, Myle Ott, Arthur Szlam, Marc'Aurelio Ranzato
22 Apr 2020

A Review of Learning with Deep Generative Models from Perspective of Graphical Modeling
Zhijian Ou
05 Aug 2018