ResearchTrend.AI

arXiv:2305.12676

Exploring Energy-based Language Models with Different Architectures and Training Methods for Speech Recognition

22 May 2023
Hong Liu, Z. Lv, Zhijian Ou, Wenbo Zhao, Qing Xiao

Papers citing "Exploring Energy-based Language Models with Different Architectures and Training Methods for Speech Recognition"

12 citing papers:
  1. Global Normalization for Streaming Speech Recognition in a Modular Framework
     Ehsan Variani, Ke Wu, Michael Riley, David Rybach, Matt Shannon, Cyril Allauzen (26 May 2022)

  2. WenetSpeech: A 10000+ Hours Multi-domain Mandarin Corpus for Speech Recognition
     Binbin Zhang, Hang Lv, Pengcheng Guo, Qijie Shao, Chao Yang, ..., Hui Bu, Xiaoyu Chen, Chenchen Zeng, Di Wu, Zhendong Peng (07 Oct 2021)

  3. Pre-Training Transformers as Energy-Based Cloze Models
     Kevin Clark, Minh-Thang Luong, Quoc V. Le, Christopher D. Manning (15 Dec 2020)

  4. Conformer: Convolution-augmented Transformer for Speech Recognition
     Anmol Gulati, James Qin, Chung-Cheng Chiu, Niki Parmar, Yu Zhang, ..., Wei Han, Shibo Wang, Zhengdong Zhang, Yonghui Wu, Ruoming Pang (16 May 2020)

  5. Integrating Discrete and Neural Features via Mixed-feature Trans-dimensional Random Field Language Models
     Silin Gao, Zhijian Ou, Wei Yang, Huifang Xu (14 Feb 2020)

  6. Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One
     Will Grathwohl, Kuan-Chieh Wang, J. Jacobsen, David Duvenaud, Mohammad Norouzi, Kevin Swersky (06 Dec 2019)

  7. Global Autoregressive Models for Data-Efficient Sequence Learning
     Tetiana Parshakova, J. Andreoli, Marc Dymetman (16 Sep 2019)

  8. BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model
     Alex Jinpeng Wang, Kyunghyun Cho (11 Feb 2019)

  9. Language modeling with Neural trans-dimensional random fields
     Bin Wang, Zhijian Ou (23 Jul 2017)

  10. Globally Normalized Transition-Based Neural Networks
      D. Andor, Chris Alberti, David J. Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov, Michael Collins (19 Mar 2016)

  11. Sequence Level Training with Recurrent Neural Networks
      Marc'Aurelio Ranzato, S. Chopra, Michael Auli, Wojciech Zaremba (20 Nov 2015)

  12. Sequence Transduction with Recurrent Neural Networks
      Alex Graves (14 Nov 2012)