Cited By
N-Best ASR Transformer: Enhancing SLU Performance using Multiple ASR Hypotheses
arXiv:2106.06519 · 11 June 2021
Karthik Ganesan, P. Bamdev, Jaivarsan B, Amresh Venugopal, A. Tushar
Papers citing "N-Best ASR Transformer: Enhancing SLU Performance using Multiple ASR Hypotheses" (8 of 8 papers shown)

ASR Error Correction using Large Language Models
Rao Ma, Mengjie Qian, Mark Gales, Kate Knill
KELM · 14 Sep 2024 · 4 citations

Adapting Pretrained Transformer to Lattices for Spoken Language Understanding
Chao-Wei Huang, Yun-Nung Chen
02 Nov 2020 · 37 citations

Jointly Encoding Word Confusion Network and Dialogue Context with BERT for Spoken Language Understanding
Chen Liu, Su Zhu, Zijian Zhao, Ruisheng Cao, Lu Chen, Kai Yu
24 May 2020 · 19 citations

Unsupervised Cross-lingual Representation Learning at Scale
Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Edouard Grave, Myle Ott, Luke Zettlemoyer, Veselin Stoyanov
05 Nov 2019 · 6,555 citations

A Hierarchical Decoding Model For Spoken Language Understanding From Unaligned Data
Zijian Zhao, Su Zhu, Kai Yu
09 Apr 2019 · 16 citations

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
VLM · SSL · SSeg · 11 Oct 2018 · 94,770 citations

Encoding Word Confusion Networks with Recurrent Neural Networks for Dialog State Tracking
Glorianna Jagfeld, Ngoc Thang Vu
18 Jul 2017 · 13 citations

Exploiting Sentence and Context Representations in Deep Neural Models for Spoken Language Understanding
L. Rojas-Barahona, Milica Gasic, N. Mrksic, Pei-hao Su, Stefan Ultes, Tsung-Hsien Wen, S. Young
13 Oct 2016 · 27 citations