ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Multiple Structural Priors Guided Self Attention Network for Language Understanding

29 December 2020 · Le Qi, Yu Zhang, Qingyu Yin, Ting Liu
ArXiv · PDF · HTML

Papers citing "Multiple Structural Priors Guided Self Attention Network for Language Understanding"

6 / 6 papers shown

1. Self-Attention with Structural Position Representations
   Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi
   MILM · 39 · 72 · 0 · 01 Sep 2019

2. Star-Transformer
   Qipeng Guo, Xipeng Qiu, Pengfei Liu, Yunfan Shao, Xiangyang Xue, Zheng Zhang
   53 · 262 · 0 · 25 Feb 2019

3. Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
   Zihang Dai, Zhilin Yang, Yiming Yang, J. Carbonell, Quoc V. Le, Ruslan Salakhutdinov
   VLM · 126 · 3,707 · 0 · 09 Jan 2019

4. Distance-based Self-Attention Network for Natural Language Inference
   Jinbae Im, Sungzoon Cho
   59 · 76 · 0 · 06 Dec 2017

5. Learning to Compose Task-Specific Tree Structures
   Jihun Choi, Kang Min Yoo, Sang-goo Lee
   57 · 189 · 0 · 10 Jul 2017

6. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference
   Adina Williams, Nikita Nangia, Samuel R. Bowman
   376 · 4,444 · 0 · 18 Apr 2017