Obtaining Better Static Word Embeddings Using Contextual Embedding Models

8 June 2021
Prakhar Gupta, Martin Jaggi
ArXiv · PDF · HTML

Papers citing "Obtaining Better Static Word Embeddings Using Contextual Embedding Models"

6 / 6 papers shown
| Title | Authors | Tags | Metrics | Date |
|---|---|---|---|---|
| A Comparative Analysis of Static Word Embeddings for Hungarian | Máté Gedeon | — | 39 · 0 · 0 | 12 May 2025 |
| A Comprehensive Analysis of Static Word Embeddings for Turkish | Karahan Sarıtaş, Cahid Arda Öz, Tunga Güngör | — | 23 · 3 · 0 | 13 May 2024 |
| Backpack Language Models | John Hewitt, John Thickstun, Christopher D. Manning, Percy Liang | KELM | 19 · 16 · 0 | 26 May 2023 |
| Combining Static and Contextualised Multilingual Embeddings | Katharina Hämmerl, Jindrich Libovický, Alexander Fraser | — | 25 · 10 · 0 | 17 Mar 2022 |
| Word Translation Without Parallel Data | Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou | — | 189 · 1,639 · 0 | 11 Oct 2017 |
| Efficient Estimation of Word Representations in Vector Space | Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean | 3DV | 296 · 31,267 · 0 | 16 Jan 2013 |