ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.
Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT

26 January 2021
Isabel Papadimitriou
Ethan A. Chi
Richard Futrell
Kyle Mahowald
Papers citing "Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT"

8 / 8 papers shown
LLMs' morphological analyses of complex FST-generated Finnish words
Anssi Moisio, Mathias Creutz, M. Kurimo
11 Jul 2024
A Method for Studying Semantic Construal in Grammatical Constructions with Interpretable Contextual Embedding Spaces
Gabriella Chronis, Kyle Mahowald, K. Erk
29 May 2023
Cross-Linguistic Syntactic Difference in Multilingual BERT: How Good is It and How Does It Affect Transfer?
Ningyu Xu, Tao Gui, Ruotian Ma, Qi Zhang, Jingting Ye, Menghan Zhang, Xuanjing Huang
21 Dec 2022
What Artificial Neural Networks Can Tell Us About Human Language Acquisition
Alex Warstadt, Samuel R. Bowman
17 Aug 2022
Insights into Pre-training via Simpler Synthetic Tasks
Yuhuai Wu, Felix Li, Percy Liang
21 Jun 2022
SemAttack: Natural Textual Attacks via Different Semantic Spaces
Wei Ping, Chejian Xu, Xiangyu Liu, Yuk-Kit Cheng, Yue Liu
03 May 2022
Pretraining with Artificial Language: Studying Transferable Knowledge in Language Models
Ryokan Ri, Yoshimasa Tsuruoka
19 Mar 2022
Masked Language Modeling and the Distributional Hypothesis: Order Word Matters Pre-training for Little
Koustuv Sinha, Robin Jia, Dieuwke Hupkes, J. Pineau, Adina Williams, Douwe Kiela
14 Apr 2021