ResearchTrend.AI
Recursive Subtree Composition in LSTM-Based Dependency Parsing
Miryam de Lhoneux, Miguel Ballesteros, Joakim Nivre
26 February 2019 (arXiv:1902.09781)

Papers citing "Recursive Subtree Composition in LSTM-Based Dependency Parsing"

13 papers
82 Treebanks, 34 Models: Universal Dependency Parsing with Multi-Treebank Models
Aaron Smith, Bernd Bohnet, Miryam de Lhoneux, Joakim Nivre, Yan Shao, Sara Stymne
06 Sep 2018
An Investigation of the Interactions Between Pre-Trained Word Embeddings, Character Models and POS Tags in Dependency Parsing
Aaron Smith, Miryam de Lhoneux, Sara Stymne, Joakim Nivre
27 Aug 2018
Colorless green recurrent networks dream hierarchically
Kristina Gulordava, Piotr Bojanowski, Edouard Grave, Tal Linzen, Marco Baroni
29 Mar 2018
Deep contextualized word representations
Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer
15 Feb 2018
Arc-swift: A Novel Transition System for Dependency Parsing
Peng Qi, Christopher D. Manning
12 May 2017
What Do Recurrent Neural Network Grammars Learn About Syntax?
Noah A. Smith, Miguel Ballesteros, Lingpeng Kong, Chris Dyer, Graham Neubig, A. Kuncoro
17 Nov 2016
Deep Biaffine Attention for Neural Dependency Parsing
Timothy Dozat, Christopher D. Manning
06 Nov 2016
Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies
Tal Linzen, Emmanuel Dupoux, Yoav Goldberg
04 Nov 2016
Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations
E. Kiperwasser, Yoav Goldberg
14 Mar 2016
Easy-First Dependency Parsing with Hierarchical Tree LSTMs
E. Kiperwasser, Yoav Goldberg
01 Mar 2016
Recurrent Neural Network Grammars
Chris Dyer, A. Kuncoro, Miguel Ballesteros, Noah A. Smith
25 Feb 2016
Improved Transition-Based Parsing by Modeling Characters instead of Words with LSTMs
Miguel Ballesteros, Chris Dyer, Noah A. Smith
04 Aug 2015
Transition-Based Dependency Parsing with Stack Long Short-Term Memory
Chris Dyer, Miguel Ballesteros, Wang Ling, Austin Matthews, Noah A. Smith
29 May 2015