R2D2: Recursive Transformer based on Differentiable Tree for Interpretable Hierarchical Language Modeling (arXiv:2107.00967)

2 July 2021 · Xiang Hu, Haitao Mi, Zujie Wen, Yafang Wang, Yi Su, Jing Zheng, Gerard de Melo

Papers citing "R2D2: Recursive Transformer based on Differentiable Tree for Interpretable Hierarchical Language Modeling"

7 / 7 papers shown
Sneaking Syntax into Transformer Language Models with Tree Regularization
Ananjan Nandi, Christopher D. Manning, Shikhar Murty · 28 Nov 2024

Banyan: Improved Representation Learning with Explicit Structure
Mattia Opper, N. Siddharth · 25 Jul 2024

Self-StrAE at SemEval-2024 Task 1: Making Self-Structuring AutoEncoders Learn More With Less
Mattia Opper, Siddharth Narayanaswamy · 02 Apr 2024

Augmenting Transformers with Recursively Composed Multi-grained Representations
Xiang Hu, Qingyang Zhu, Kewei Tu, Wei Wu · 28 Sep 2023

A Multi-Grained Self-Interpretable Symbolic-Neural Model For Single/Multi-Labeled Text Classification
Xiang Hu, Xinyu Kong, Kewei Tu · 06 Mar 2023 · MILM, BDL

Forming Trees with Treeformers
Nilay Patel, Jeffrey Flanigan · 14 Jul 2022 · AI4CE

Fast-R2D2: A Pretrained Recursive Neural Network based on Pruned CKY for Grammar Induction and Text Representation
Xiang Hu, Haitao Mi, Liang Li, Gerard de Melo · 01 Mar 2022