R2D2: Recursive Transformer based on Differentiable Tree for Interpretable Hierarchical Language Modeling
Xiang Hu, Haitao Mi, Zujie Wen, Yafang Wang, Yi Su, Jing Zheng, Gerard de Melo
arXiv:2107.00967 · 2 July 2021
Papers citing "R2D2: Recursive Transformer based on Differentiable Tree for Interpretable Hierarchical Language Modeling" (7 papers)

Sneaking Syntax into Transformer Language Models with Tree Regularization
Ananjan Nandi, Christopher D. Manning, Shikhar Murty
28 Nov 2024

Banyan: Improved Representation Learning with Explicit Structure
Mattia Opper, N. Siddharth
25 Jul 2024

Self-StrAE at SemEval-2024 Task 1: Making Self-Structuring AutoEncoders Learn More With Less
Mattia Opper, Siddharth Narayanaswamy
02 Apr 2024

Augmenting Transformers with Recursively Composed Multi-grained Representations
Xiang Hu, Qingyang Zhu, Kewei Tu, Wei Wu
28 Sep 2023

A Multi-Grained Self-Interpretable Symbolic-Neural Model For Single/Multi-Labeled Text Classification
Xiang Hu, Xinyu Kong, Kewei Tu
06 Mar 2023

Forming Trees with Treeformers
Nilay Patel, Jeffrey Flanigan
14 Jul 2022

Fast-R2D2: A Pretrained Recursive Neural Network based on Pruned CKY for Grammar Induction and Text Representation
Xiang Hu, Haitao Mi, Liang Li, Gerard de Melo
01 Mar 2022