How much pretraining data do language models need to learn syntax?
Laura Pérez-Mayos, Miguel Ballesteros, Leo Wanner
7 September 2021 (arXiv:2109.03160)
Papers citing "How much pretraining data do language models need to learn syntax?" (8 papers)
Pre-training LLMs using human-like development data corpus
Khushi Bhardwaj, Raj Sanjay Shah, Sashank Varma
08 Nov 2023
Language-Agnostic Bias Detection in Language Models with Bias Probing
Abdullatif Köksal, Omer F. Yalcin, Ahmet Akbiyik, M. Kilavuz, Anna Korhonen, Hinrich Schütze
22 May 2023
Can We Use Probing to Better Understand Fine-tuning and Knowledge Distillation of the BERT NLU?
Jakub Hościłowicz, Marcin Sowański, Piotr Czubowski, Artur Janicki
27 Jan 2023
SocioProbe: What, When, and Where Language Models Learn about Sociodemographics
Anne Lauscher, Federico Bianchi, Samuel R. Bowman, Dirk Hovy
08 Nov 2022
State-of-the-art generalisation research in NLP: A taxonomy and review
Dieuwke Hupkes, Mario Giulianelli, Verna Dankers, Mikel Artetxe, Yanai Elazar, ..., Leila Khalatbari, Maria Ryskina, Rita Frieske, Ryan Cotterell, Zhijing Jin
06 Oct 2022
A computational psycholinguistic evaluation of the syntactic abilities of Galician BERT models at the interface of dependency resolution and training time
Iria de-Dios-Flores, Marcos Garcia
06 Jun 2022
What you can cram into a single vector: Probing sentence embeddings for linguistic properties
Alexis Conneau, Germán Kruszewski, Guillaume Lample, Loïc Barrault, Marco Baroni
03 May 2018
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018