Finding Universal Grammatical Relations in Multilingual BERT
Ethan A. Chi, John Hewitt, Christopher D. Manning
arXiv:2005.04511, 9 May 2020
Papers citing "Finding Universal Grammatical Relations in Multilingual BERT" (37 papers):
Do we really have to filter out random noise in pre-training data for language models?
  Jinghan Ru, Yuxin Xie, Xianwei Zhuang, Yuguo Yin, Zhihui Guo, Zhiming Liu, Qianli Ren, Yuexian Zou. 10 Feb 2025.
Large Language Models Share Representations of Latent Grammatical Concepts Across Typologically Diverse Languages
  Jannik Brinkmann, Chris Wendler, Christian Bartelt, Aaron Mueller. 10 Jan 2025.
Exploring Multilingual Large Language Models for Enhanced TNM classification of Radiology Report in lung cancer staging
  Hidetoshi Matsuo, Mizuho Nishio, Takaaki Matsunaga, Koji Fujimoto, Takamichi Murakami. 05 Jun 2024.
Tracing the Roots of Facts in Multilingual Language Models: Independent, Shared, and Transferred Knowledge
  Xin Zhao, Naoki Yoshinaga, Daisuke Oba. 08 Mar 2024.
Structural Priming Demonstrates Abstract Grammatical Representations in Multilingual Language Models
  J. Michaelov, Catherine Arnett, Tyler A. Chang, Benjamin Bergen. 15 Nov 2023.
Investigating semantic subspaces of Transformer sentence embeddings through linear structural probing
  Dmitry Nikolaev, Sebastian Padó. 18 Oct 2023.
Morphosyntactic probing of multilingual BERT models
  Judit Ács, Endre Hamerlik, Roy Schwartz, Noah A. Smith, András Kornai. 09 Jun 2023.
How do languages influence each other? Studying cross-lingual data sharing during LM fine-tuning
  Rochelle Choenni, Dan Garrette, Ekaterina Shutova. 22 May 2023.
Discovering Universal Geometry in Embeddings with ICA
  Hiroaki Yamagiwa, Momose Oyama, Hidetoshi Shimodaira. 22 May 2023.
Language Embeddings Sometimes Contain Typological Generalizations
  Robert Östling, Murathan Kurfali. 19 Jan 2023.
SensePOLAR: Word sense aware interpretability for pre-trained contextual word embeddings
  Jan Engler, Sandipan Sikdar, Marlene Lutz, M. Strohmaier. 11 Jan 2023.
On the Effect of Pre-training for Transformer in Different Modality on Offline Reinforcement Learning
  S. Takagi. 17 Nov 2022.
SocioProbe: What, When, and Where Language Models Learn about Sociodemographics
  Anne Lauscher, Federico Bianchi, Samuel R. Bowman, Dirk Hovy. 08 Nov 2022.
Data-Efficient Cross-Lingual Transfer with Language-Specific Subnetworks
  Rochelle Choenni, Dan Garrette, Ekaterina Shutova. 31 Oct 2022.
Universal and Independent: Multilingual Probing Framework for Exhaustive Model Interpretation and Evaluation
  O. Serikov, Vitaly Protasov, E. Voloshina, V. Knyazkova, Tatiana Shavrina. 24 Oct 2022.
Revisiting the Practical Effectiveness of Constituency Parse Extraction from Pre-trained Language Models
  Taeuk Kim. 15 Sep 2022.
Insights into Pre-training via Simpler Synthetic Tasks
  Yuhuai Wu, Felix Li, Percy Liang. 21 Jun 2022.
Word-order typology in Multilingual BERT: A case study in subordinate-clause detection
  Dmitry Nikolaev, Sebastian Padó. 24 May 2022.
The Geometry of Multilingual Language Model Representations
  Tyler A. Chang, Z. Tu, Benjamin Bergen. 22 May 2022.
Analyzing Gender Representation in Multilingual Models
  Hila Gonen, Shauli Ravfogel, Yoav Goldberg. 20 Apr 2022.
Probing for Labeled Dependency Trees
  Max Müller-Eberstein, Rob van der Goot, Barbara Plank. 24 Mar 2022.
Pretraining with Artificial Language: Studying Transferable Knowledge in Language Models
  Ryokan Ri, Yoshimasa Tsuruoka. 19 Mar 2022.
Coloring the Blank Slate: Pre-training Imparts a Hierarchical Inductive Bias to Sequence-to-sequence Models
  Aaron Mueller, Robert Frank, Tal Linzen, Luheng Wang, Sebastian Schuster. 17 Mar 2022.
Micromodels for Efficient, Explainable, and Reusable Systems: A Case Study on Mental Health
  Andrew Lee, Jonathan K. Kummerfeld, Lawrence C. An, Rada Mihalcea. 28 Sep 2021.
Cross-lingual Transfer of Monolingual Models
  Evangelia Gogoulou, Ariel Ekgren, T. Isbister, Magnus Sahlgren. 15 Sep 2021.
Examining Cross-lingual Contextual Embeddings with Orthogonal Structural Probes
  Tomasz Limisiewicz, David Mareček. 10 Sep 2021.
Codified audio language modeling learns useful representations for music information retrieval
  Rodrigo Castellon, Chris Donahue, Percy Liang. 12 Jul 2021.
A Primer on Pretrained Multilingual Language Models
  Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar. 01 Jul 2021.
A multilabel approach to morphosyntactic probing
  Naomi Tachikawa Shapiro, Amandalynne Paullada, Shane Steinert-Threlkeld. 17 Apr 2021.
The Interplay of Variant, Size, and Task Type in Arabic Pre-trained Language Models
  Go Inoue, Bashar Alhafni, Nurpeiis Baimukan, Houda Bouamor, Nizar Habash. 11 Mar 2021.
Measuring and Improving Consistency in Pretrained Language Models
  Yanai Elazar, Nora Kassner, Shauli Ravfogel, Abhilasha Ravichander, Eduard H. Hovy, Hinrich Schütze, Yoav Goldberg. 01 Feb 2021.
First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
  Benjamin Muller, Yanai Elazar, Benoît Sagot, Djamé Seddah. 26 Jan 2021.
Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT
  Isabel Papadimitriou, Ethan A. Chi, Richard Futrell, Kyle Mahowald. 26 Jan 2021.
On the Language Neutrality of Pre-trained Multilingual Representations
  Jindřich Libovický, Rudolf Rosa, Alexander M. Fraser. 09 Apr 2020.
Investigating Multilingual NMT Representations at Scale
  Sneha Kudugunta, Ankur Bapna, Isaac Caswell, N. Arivazhagan, Orhan Firat. 05 Sep 2019.
What you can cram into a single vector: Probing sentence embeddings for linguistic properties
  Alexis Conneau, Germán Kruszewski, Guillaume Lample, Loïc Barrault, Marco Baroni. 03 May 2018.
Word Translation Without Parallel Data
  Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou. 11 Oct 2017.