On Sampling-Based Training Criteria for Neural Language Modeling

21 April 2021
Yingbo Gao
David Thulke
Alexander Gerstenberger
Viet Anh Khoa Tran
Ralf Schlüter
Hermann Ney

Papers citing "On Sampling-Based Training Criteria for Neural Language Modeling"

  1. Self-Normalized Importance Sampling for Neural Language Modeling
     Zijian Yang, Yingbo Gao, Alexander Gerstenberger, Jintao Jiang, Ralf Schlüter, Hermann Ney
     11 Nov 2021

  2. A Mutual Information Maximization Perspective of Language Representation Learning
     Lingpeng Kong, Cyprien de Masson d'Autume, Wang Ling, Lei Yu, Zihang Dai, Dani Yogatama
     18 Oct 2019

  3. Efficient Estimation of Word Representations in Vector Space
     Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean
     16 Jan 2013