
Embedding Words as Distributions with a Bayesian Skip-gram Model

arXiv:1711.11027, 29 November 2017
Arthur Brazinskas, Serhii Havrylov, Ivan Titov
BDL

Papers citing "Embedding Words as Distributions with a Bayesian Skip-gram Model"

4 / 4 papers shown:

  1. Bayesian Neural Word Embedding (Oren Barkan; BDL; 21 Mar 2016)
  2. Do Multi-Sense Embeddings Improve Natural Language Understanding? (Jiwei Li, Dan Jurafsky; 02 Jun 2015)
  3. One Billion Word Benchmark for Measuring Progress in Statistical Language Modeling (Ciprian Chelba, Tomas Mikolov, M. Schuster, Qi Ge, T. Brants, P. Koehn, T. Robinson; 11 Dec 2013)
  4. Natural Language Processing (almost) from Scratch (R. Collobert, Jason Weston, Léon Bottou, Michael Karlen, Koray Kavukcuoglu, Pavel P. Kuksa; 02 Mar 2011)