ResearchTrend.AI

Matrix Factorization using Window Sampling and Negative Sampling for Improved Word Representations

2 June 2016
Alexandre Salle, M. Idiart, Aline Villavicencio
arXiv:1606.00819

Papers citing "Matrix Factorization using Window Sampling and Negative Sampling for Improved Word Representations"

9 / 9 papers shown
NoPPA: Non-Parametric Pairwise Attention Random Walk Model for Sentence Representation
Xuansheng Wu, Zhiyi Zhao, Ninghao Liu
24 Feb 2023

A Topological Approach to Compare Document Semantics Based on a New Variant of Syntactic N-grams
F. Meng
08 Mar 2021

A Common Semantic Space for Monolingual and Cross-Lingual Meta-Embeddings
G. R. Claramunt, Rodrigo Agerri, German Rigau
17 Jan 2020

Why So Down? The Role of Negative (and Positive) Pointwise Mutual Information in Distributional Semantics
Alexandre Salle, Aline Villavicencio
19 Aug 2019

Learning Semantic Representations for Novel Words: Leveraging Both Form and Context
Timo Schick, Hinrich Schütze
09 Nov 2018

Coherence-Aware Neural Topic Modeling
Ran Ding, Ramesh Nallapati, Bing Xiang
07 Sep 2018

Experiential, Distributional and Dependency-based Word Embeddings have Complementary Roles in Decoding Brain Activity
Samira Abnar, Rasyan Ahmed, Max Mijnheer, Willem H. Zuidema
25 Nov 2017

Improving Negative Sampling for Word Representation using Self-embedded Features
Long Chen, Fajie Yuan, J. Jose, Weinan Zhang
26 Oct 2017

Enhancing the LexVec Distributed Word Representation Model Using Positional Contexts and External Memory
Alexandre Salle, M. Idiart, Aline Villavicencio
03 Jun 2016