Are pre-trained text representations useful for multilingual and multi-dimensional language proficiency modeling?
Taraka Rama, Sowmya Vajjala
25 February 2021 · arXiv: 2102.12971
Links: arXiv (abs) · PDF · HTML · GitHub (1★)
Papers citing "Are pre-trained text representations useful for multilingual and multi-dimensional language proficiency modeling?" (7 of 7 papers shown)
| Title | Authors | Citations | Date |
|---|---|---|---|
| A Systematic Analysis of Morphological Content in BERT Models for Multiple Languages | Daniel Edmiston | 32 | 06 Apr 2020 |
| Automated Essay Scoring based on Two-Stage Learning | Jiawei Liu, Yang Xu, Yaguang Zhu | 61 | 23 Jan 2019 |
| Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond | Mikel Artetxe, Holger Schwenk | 1,017 | 26 Dec 2018 |
| BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova | 95,229 | 11 Oct 2018 |
| Experiments with Universal CEFR Classification | Sowmya Vajjala, Taraka Rama | 35 | 18 Apr 2018 |
| Neural Multi-task Learning in Automated Assessment | Ronan Cummins, Marek Rei | 22 | 21 Jan 2018 |
| Automatic Text Scoring Using Neural Networks | Dimitrios Alikaniotis, H. Yannakoudakis, Marek Rei | 258 | 14 Jun 2016 |