arXiv:2109.09237
Cited By
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models
19 September 2021
Qianchu Liu, Fangyu Liu, Nigel Collier, Anna Korhonen, Ivan Vulić
Papers citing "MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models" (5 of 5 papers shown)
Injecting Wiktionary to improve token-level contextual representations using contrastive learning
Anna Mosolova, Marie Candito, Carlos Ramisch
12 Feb 2024
Labels Need Prompts Too: Mask Matching for Natural Language Understanding Tasks
Bo Li, Wei Ye, Quan-ding Wang, Wen Zhao, Shikun Zhang
VLM
14 Dec 2023
EnCore: Fine-Grained Entity Typing by Pre-Training Entity Encoders on Coreference Chains
Frank Mtumbuka, Steven Schockaert
22 May 2023
Reranking Overgenerated Responses for End-to-End Task-Oriented Dialogue Systems
Songbo Hu, Ivan Vulić, Fangyu Liu, Anna Korhonen
07 Nov 2022
Let's Play Mono-Poly: BERT Can Reveal Words' Polysemy Level and Partitionability into Senses
Aina Garí Soler, Marianna Apidianaki
MILM
29 Apr 2021