
Improving Negative Sampling for Word Representation using Self-embedded Features
arXiv: 1710.09805

26 October 2017
Authors: Long Chen, Fajie Yuan, J. Jose, Weinan Zhang
Topic: SSL

Papers citing "Improving Negative Sampling for Word Representation using Self-embedded Features"

4 / 4 papers shown

Classification Benchmarks for Under-resourced Bengali Language based on Multichannel Convolutional-LSTM Network
Md. Rezaul Karim, Bharathi Raja Chakravarthi, John P. McCrae, Michael Cochez
11 Apr 2020

Extreme Classification via Adversarial Softmax Approximation
Robert Bamler, Stephan Mandt
15 Feb 2020

Relaxed Softmax for learning from Positive and Unlabeled data
Ugo Tanielian, Flavian Vasile
17 Sep 2019

From Frequency to Meaning: Vector Space Models of Semantics
Peter D. Turney, Patrick Pantel
04 Mar 2010