ResearchTrend.AI

arXiv: 2204.05188

Tokenwise Contrastive Pretraining for Finer Speech-to-BERT Alignment in End-to-End Speech-to-Intent Systems

11 April 2022
Vishal Sunder, Eric Fosler-Lussier, Samuel Thomas, H. Kuo, Brian Kingsbury

Papers citing "Tokenwise Contrastive Pretraining for Finer Speech-to-BERT Alignment in End-to-End Speech-to-Intent Systems"

2 papers:

1. Improving End-to-End SLU performance with Prosodic Attention and Distillation
   Shangeth Rajaa
   14 May 2023

2. Understanding Shared Speech-Text Representations
   Gary Wang, Kyle Kastner, Ankur Bapna, Zhehuai Chen, Andrew Rosenberg, Bhuvana Ramabhadran, Yu Zhang
   27 Apr 2023