ResearchTrend.AI

Unsupervised Domain Clusters in Pretrained Language Models
arXiv: 2004.02105
5 April 2020
Roee Aharoni, Yoav Goldberg

Papers citing "Unsupervised Domain Clusters in Pretrained Language Models"

13 / 63 papers shown

• CiteWorth: Cite-Worthiness Detection for Improved Scientific Document Understanding (23 May 2021)
  Dustin Wright, Isabelle Augenstein
• "Average" Approximates "First Principal Component"? An Empirical Analysis on Representations from Neural Language Models (18 Apr 2021)
  Zihan Wang, Chengyu Dong, Jingbo Shang [FAtt]
• An Interpretability Illusion for BERT (14 Apr 2021)
  Tolga Bolukbasi, Adam Pearce, Ann Yuan, Andy Coenen, Emily Reif, Fernanda Viégas, Martin Wattenberg [MILM, FAtt]
• Domain Adaptation and Multi-Domain Adaptation for Neural Machine Translation: A Survey (14 Apr 2021)
  Danielle Saunders [AI4CE]
• Semantic maps and metrics for science using deep transformer encoders (13 Apr 2021)
  Brendan Chambers, James A. Evans [MedIm]
• DirectProbe: Studying Representations without Classifiers (13 Apr 2021)
  Yichu Zhou, Vivek Srikumar
• First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT (26 Jan 2021)
  Benjamin Muller, Yanai Elazar, Benoît Sagot, Djamé Seddah [LRM]
• Nearest Neighbor Machine Translation (1 Oct 2020)
  Urvashi Khandelwal, Angela Fan, Dan Jurafsky, Luke Zettlemoyer, M. Lewis [RALM]
• MISA: Modality-Invariant and -Specific Representations for Multimodal Sentiment Analysis (7 May 2020)
  Devamanyu Hazarika, Roger Zimmermann, Soujanya Poria
• Train No Evil: Selective Masking for Task-Guided Pre-Training (21 Apr 2020)
  Yuxian Gu, Zhengyan Zhang, Xiaozhi Wang, Zhiyuan Liu, Maosong Sun
• Split and Rephrase: Better Evaluation and a Stronger Baseline (2 May 2018)
  Roee Aharoni, Yoav Goldberg [MoE]
• GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding (20 Apr 2018)
  Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman [ELM]
• Six Challenges for Neural Machine Translation (12 Jun 2017)
  Philipp Koehn, Rebecca Knowles [AAML, AIMat]