Unsupervised Recurrent Neural Network Grammars (arXiv:1904.03746)
7 April 2019
Yoon Kim, Alexander M. Rush, Lei Yu, A. Kuncoro, Chris Dyer, Gábor Melis
LRM, RALM, SSL

Papers citing "Unsupervised Recurrent Neural Network Grammars" (21 papers)

Improving Unsupervised Constituency Parsing via Maximizing Semantic Information
Junjie Chen, Xiangheng He, Yusuke Miyao, Danushka Bollegala
03 Oct 2024

Activity Grammars for Temporal Action Segmentation
Dayoung Gong, Joonseok Lee, Deunsol Jung, Suha Kwak, Minsu Cho
07 Dec 2023

Augmenting Transformers with Recursively Composed Multi-grained Representations
Xiang Hu, Qingyang Zhu, Kewei Tu, Wei Wu
28 Sep 2023

Learning a Grammar Inducer from Massive Uncurated Instructional Videos
Songyang Zhang, Linfeng Song, Lifeng Jin, Haitao Mi, Kun Xu, Dong Yu, Jiebo Luo
22 Oct 2022

Revisiting the Practical Effectiveness of Constituency Parse Extraction from Pre-trained Language Models
Taeuk Kim
15 Sep 2022

Exploiting Inductive Bias in Transformers for Unsupervised Disentanglement of Syntax and Semantics with VAEs
G. Felhi, Joseph Le Roux, Djamé Seddah
DRL
12 May 2022

Dependency-based Mixture Language Models
Zhixian Yang, Xiaojun Wan
19 Mar 2022

Fast-R2D2: A Pretrained Recursive Neural Network based on Pruned CKY for Grammar Induction and Text Representation
Xiang Hu, Haitao Mi, Liang Li, Gerard de Melo
01 Mar 2022

Scaling Structured Inference with Randomization
Yao Fu, John P. Cunningham, Mirella Lapata
BDL
07 Dec 2021

Interpreting Deep Learning Models in Natural Language Processing: A Review
Xiaofei Sun, Diyi Yang, Xiaoya Li, Tianwei Zhang, Yuxian Meng, Han Qiu, Guoyin Wang, Eduard H. Hovy, Jiwei Li
20 Oct 2021

Co-training an Unsupervised Constituency Parser with Weak Supervision
Nickil Maveli, Shay B. Cohen
SSL
05 Oct 2021

The Limitations of Limited Context for Constituency Parsing
Yuchen Li, Andrej Risteski
03 Jun 2021

Learning Syntax from Naturally-Occurring Bracketings
Tianze Shi, Ozan Irsoy, Igor Malioutov, Lillian Lee
28 Apr 2021

PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols
Songlin Yang, Yanpeng Zhao, Kewei Tu
28 Apr 2021

Refining Targeted Syntactic Evaluation of Language Models
Benjamin Newman, Kai-Siang Ang, Julia Gong, John Hewitt
19 Apr 2021

Syntactic Structure Distillation Pretraining For Bidirectional Encoders
A. Kuncoro, Lingpeng Kong, Daniel Fried, Dani Yogatama, Laura Rimell, Chris Dyer, Phil Blunsom
27 May 2020

Posterior Control of Blackbox Generation
Xiang Lisa Li, Alexander M. Rush
10 May 2020

Are Pre-trained Language Models Aware of Phrases? Simple but Strong Baselines for Grammar Induction
Taeuk Kim, Jihun Choi, Daniel Edmiston, Sang-goo Lee
30 Jan 2020

Neural Machine Translation: A Review and Survey
Felix Stahlberg
3DV, AI4TS, MedIm
04 Dec 2019

PaLM: A Hybrid Parser and Language Model
Hao Peng, Roy Schwartz, Noah A. Smith
AIMat
04 Sep 2019

Scalable Syntax-Aware Language Models Using Knowledge Distillation
A. Kuncoro, Chris Dyer, Laura Rimell, S. Clark, Phil Blunsom
14 Jun 2019