Neural Network Acceptability Judgments (arXiv:1805.12471)

31 May 2018
Alex Warstadt
Amanpreet Singh
Samuel R. Bowman

Papers citing "Neural Network Acceptability Judgments"

Showing 44 of 894 citing papers.
Multilingual Question Answering from Formatted Text applied to Conversational Agents
W. Siblini
Charlotte Pasqual
Axel Lavielle
Mohamed Challal
Cyril Cauchois
60
19
0
10 Oct 2019
Knowledge Distillation from Internal Representations
Gustavo Aguilar
Yuan Ling
Yu Zhang
Benjamin Yao
Xing Fan
Edward Guo
106
181
0
08 Oct 2019
MinWikiSplit: A Sentence Splitting Corpus with Minimal Propositions
C. Niklaus
André Freitas
Siegfried Handschuh
78
15
0
26 Sep 2019
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
Zhenzhong Lan
Mingda Chen
Sebastian Goodman
Kevin Gimpel
Piyush Sharma
Radu Soricut
SSL, AIMat
415
6,479
0
26 Sep 2019
FreeLB: Enhanced Adversarial Training for Natural Language Understanding
Chen Zhu
Yu Cheng
Zhe Gan
S. Sun
Tom Goldstein
Jingjing Liu
AAML
296
443
0
25 Sep 2019
CAT: Compression-Aware Training for bandwidth reduction
Chaim Baskin
Brian Chmiel
Evgenii Zheltonozhskii
Ron Banner
A. Bronstein
A. Mendelson
MQ
67
12
0
25 Sep 2019
Mixout: Effective Regularization to Finetune Large-scale Pretrained Language Models
Cheolhyoung Lee
Kyunghyun Cho
Wanmo Kang
MoE
291
209
0
25 Sep 2019
TinyBERT: Distilling BERT for Natural Language Understanding
Xiaoqi Jiao
Yichun Yin
Lifeng Shang
Xin Jiang
Xiao Chen
Linlin Li
F. Wang
Qun Liu
VLM
126
1,881
0
23 Sep 2019
Slice-based Learning: A Programming Model for Residual Learning in Critical Data Slices
V. Chen
Sen Wu
Zhenzhen Weng
Alexander Ratner
Christopher Ré
96
56
0
13 Sep 2019
Learning to Discriminate Perturbations for Blocking Adversarial Attacks in Text Classification
Yichao Zhou
Jyun-Yu Jiang
Kai-Wei Chang
Wei Wang
AAML
66
119
0
06 Sep 2019
Investigating BERT's Knowledge of Language: Five Analysis Methods with NPIs
Alex Warstadt
Yuning Cao
Ioana Grosu
Wei Peng
Hagen Blix
...
Jason Phang
Anhad Mohananey
Phu Mon Htut
Paloma Jeretic
Samuel R. Bowman
71
123
0
05 Sep 2019
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity
Anne Lauscher
Ivan Vulić
Edoardo Ponti
Anna Korhonen
Goran Glavaš
SSL
85
58
0
05 Sep 2019
Semantics-aware BERT for Language Understanding
Zhuosheng Zhang
Yuwei Wu
Zhao Hai
Z. Li
Shuailiang Zhang
Xi Zhou
Xiang Zhou
61
370
0
05 Sep 2019
Transfer Fine-Tuning: A BERT Case Study
Yuki Arase
Junichi Tsujii
37
41
0
03 Sep 2019
Investigating Meta-Learning Algorithms for Low-Resource Natural Language Understanding Tasks
Zi-Yi Dou
Keyi Yu
Antonios Anastasopoulos
71
127
0
27 Aug 2019
Empirical Evaluation of Multi-task Learning in Deep Neural Networks for Natural Language Processing
Jianquan Li
Xiaokang Liu
Wenpeng Yin
Min Yang
Liqun Ma
Yaohong Jin
AIMat
75
14
0
16 Aug 2019
StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding
Wei Wang
Bin Bi
Ming Yan
Chen Henry Wu
Zuyi Bao
Jiangnan Xia
Liwei Peng
Luo Si
87
264
0
13 Aug 2019
On Identifiability in Transformers
Gino Brunner
Yang Liu
Damian Pascual
Oliver Richter
Massimiliano Ciaramita
Roger Wattenhofer
ViT
77
189
0
12 Aug 2019
ERNIE 2.0: A Continual Pre-training Framework for Language Understanding
Yu Sun
Shuohuan Wang
Yukun Li
Shikun Feng
Hao Tian
Hua Wu
Haifeng Wang
CLL
110
813
0
29 Jul 2019
RoBERTa: A Robustly Optimized BERT Pretraining Approach
Yinhan Liu
Myle Ott
Naman Goyal
Jingfei Du
Mandar Joshi
Danqi Chen
Omer Levy
M. Lewis
Luke Zettlemoyer
Veselin Stoyanov
AIMat
782
24,613
0
26 Jul 2019
SpanBERT: Improving Pre-training by Representing and Predicting Spans
Mandar Joshi
Danqi Chen
Yinhan Liu
Daniel S. Weld
Luke Zettlemoyer
Omer Levy
197
1,974
0
24 Jul 2019
A Pragmatics-Centered Evaluation Framework for Natural Language Understanding
Damien Sileo
Tim Van de Cruys
Camille Pradel
Philippe Muller
ELM
34
3
0
19 Jul 2019
BAM! Born-Again Multi-Task Networks for Natural Language Understanding
Kevin Clark
Minh-Thang Luong
Urvashi Khandelwal
Christopher D. Manning
Quoc V. Le
81
230
0
10 Jul 2019
A Comparative Analysis of Knowledge-Intensive and Data-Intensive Semantic Parsers
Junjie Cao
Zi-yu Lin
Weiwei SUN
Xiaojun Wan
27
1
0
04 Jul 2019
Understanding and Improving Transformer From a Multi-Particle Dynamic System Point of View
Yiping Lu
Zhuohan Li
Di He
Zhiqing Sun
Bin Dong
Tao Qin
Liwei Wang
Tie-Yan Liu
AI4CE
84
176
0
06 Jun 2019
Are Sixteen Heads Really Better than One?
Paul Michel
Omer Levy
Graham Neubig
MoE
120
1,072
0
25 May 2019
Human vs. Muppet: A Conservative Estimate of Human Performance on the GLUE Benchmark
Nikita Nangia
Samuel R. Bowman
ELM, ALM
82
76
0
24 May 2019
ERNIE: Enhanced Language Representation with Informative Entities
Zhengyan Zhang
Xu Han
Zhiyuan Liu
Xin Jiang
Maosong Sun
Qun Liu
159
1,404
0
17 May 2019
Unified Language Model Pre-training for Natural Language Understanding and Generation
Li Dong
Nan Yang
Wenhui Wang
Furu Wei
Xiaodong Liu
Yu Wang
Jianfeng Gao
M. Zhou
H. Hon
ELM, AI4CE
244
1,561
0
08 May 2019
SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems
Alex Jinpeng Wang
Yada Pruksachatkun
Nikita Nangia
Amanpreet Singh
Julian Michael
Felix Hill
Omer Levy
Samuel R. Bowman
ELM
353
2,328
0
02 May 2019
Probing What Different NLP Tasks Teach Machines about Function Word Comprehension
Najoung Kim
Roma Patel
Adam Poliak
Alex Jinpeng Wang
Patrick Xia
...
Alexis Ross
Tal Linzen
Benjamin Van Durme
Samuel R. Bowman
Ellie Pavlick
78
107
0
25 Apr 2019
Recent Advances in Natural Language Inference: A Survey of Benchmarks, Resources, and Approaches
Shane Storks
Qiaozi Gao
J. Chai
96
132
0
02 Apr 2019
To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks
Matthew E. Peters
Sebastian Ruder
Noah A. Smith
109
441
0
14 Mar 2019
Human acceptability judgements for extractive sentence compression
Abram Handler
Brian Dillon
Brendan O'Connor
33
2
0
01 Feb 2019
Multi-Task Deep Neural Networks for Natural Language Understanding
Xiaodong Liu
Pengcheng He
Weizhu Chen
Jianfeng Gao
AI4CE
158
1,273
0
31 Jan 2019
NeuNetS: An Automated Synthesis Engine for Neural Network Design
Atin Sood
Benjamin Elder
Benjamin Herta
Chao Xue
C. Bekas
...
M. Choudhury
Rong Yan
R. Istrate
Ruchi Puri
Tejaswini Pedapati
SyDa
62
7
0
17 Jan 2019
Linguistic Analysis of Pretrained Sentence Encoders with Acceptability Judgments
Alex Warstadt
Samuel R. Bowman
92
23
0
11 Jan 2019
Can You Tell Me How to Get Past Sesame Street? Sentence-Level Pretraining Beyond Language Modeling
Alex Jinpeng Wang
Jan Hula
Patrick Xia
R. Pappagari
R. Thomas McCoy
...
Berlin Chen
Benjamin Van Durme
Edouard Grave
Ellie Pavlick
Samuel R. Bowman
LRM, VLM
78
27
0
28 Dec 2018
Verb Argument Structure Alternations in Word and Sentence Embeddings
Katharina Kann
Alex Warstadt
Adina Williams
Samuel R. Bowman
62
50
0
27 Nov 2018
Sentence Encoders on STILTs: Supplementary Training on Intermediate Labeled-data Tasks
Jason Phang
Thibault Févry
Samuel R. Bowman
123
470
0
02 Nov 2018
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin
Ming-Wei Chang
Kenton Lee
Kristina Toutanova
VLM, SSL, SSeg
1.9K
95,531
0
11 Oct 2018
Targeted Syntactic Evaluation of Language Models
Rebecca Marvin
Tal Linzen
94
417
0
27 Aug 2018
Unsupervised Learning of Sentence Representations Using Sequence Consistency
Siddhartha Brahma
SSL
87
8
0
10 Aug 2018
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Jinpeng Wang
Amanpreet Singh
Julian Michael
Felix Hill
Omer Levy
Samuel R. Bowman
ELM
1.2K
7,210
0
20 Apr 2018