Distilling Neural Networks for Greener and Faster Dependency Parsing
Mark Anderson, Carlos Gómez-Rodríguez
1 June 2020 · arXiv: 2006.00844

Papers citing "Distilling Neural Networks for Greener and Faster Dependency Parsing" (5 of 5 papers shown)

The Fragility of Multi-Treebank Parsing Evaluation
I. Alonso-Alonso, David Vilares, Carlos Gómez-Rodríguez
14 Sep 2022

Not All Linearizations Are Equally Data-Hungry in Sequence Labeling Parsing
Alberto Muñoz-Ortiz, Michalina Strzyz, David Vilares
17 Aug 2021

A Modest Pareto Optimisation Analysis of Dependency Parsers in 2021
Mark Anderson, Carlos Gómez-Rodríguez
08 Jun 2021

Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor
Xinyu Wang, Yong Jiang, Zhaohui Yan, Zixia Jia, Nguyen Bach, Tao Wang, Zhongqiang Huang, Fei Huang, Kewei Tu
10 Oct 2020

It's Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners
Timo Schick, Hinrich Schütze
15 Sep 2020