Multiple Structural Priors Guided Self Attention Network for Language Understanding
arXiv: 2012.14642, 29 December 2020
Authors: Le Qi, Yu Zhang, Qingyu Yin, Ting Liu
Papers citing "Multiple Structural Priors Guided Self Attention Network for Language Understanding" (6 papers):
Self-Attention with Structural Position Representations
Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi (01 Sep 2019)

Star-Transformer
Qipeng Guo, Xipeng Qiu, Pengfei Liu, Yunfan Shao, Xiangyang Xue, Zheng Zhang (25 Feb 2019)

Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
Zihang Dai, Zhilin Yang, Yiming Yang, J. Carbonell, Quoc V. Le, Ruslan Salakhutdinov (09 Jan 2019)

Distance-based Self-Attention Network for Natural Language Inference
Jinbae Im, Sungzoon Cho (06 Dec 2017)

Learning to Compose Task-Specific Tree Structures
Jihun Choi, Kang Min Yoo, Sang-goo Lee (10 Jul 2017)

A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference
Adina Williams, Nikita Nangia, Samuel R. Bowman (18 Apr 2017)