Self-Attention with Structural Position Representations
arXiv:1909.00383 · 1 September 2019
Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi
MILM
Papers citing "Self-Attention with Structural Position Representations" (10 of 10 papers shown)
Equipping Sketch Patches with Context-Aware Positional Encoding for Graphic Sketch Representation
Sicong Zang, Zhijun Fang · 26 Mar 2024 · 42 / 0 / 0

Joint Pre-Training with Speech and Bilingual Text for Direct Speech to Speech Translation
Kun Wei, Long Zhou, Zi-Hua Zhang, Liping Chen, Shujie Liu, Lei He, Jinyu Li, Furu Wei · 31 Oct 2022 · 30 / 13 / 0

Pure Transformers are Powerful Graph Learners
Jinwoo Kim, Tien Dat Nguyen, Seonwoo Min, Sungjun Cho, Moontae Lee, Honglak Lee, Seunghoon Hong · 06 Jul 2022 · 43 / 189 / 0

Permutation invariant graph-to-sequence model for template-free retrosynthesis and reaction prediction
Zhengkai Tu, Connor W. Coley · 19 Oct 2021 · 30 / 90 / 0

The Impact of Positional Encodings on Multilingual Compression
Vinit Ravishankar, Anders Søgaard · 11 Sep 2021 · 25 / 5 / 0

How Does Selective Mechanism Improve Self-Attention Networks?
Xinwei Geng, Longyue Wang, Xing Wang, Bing Qin, Ting Liu, Zhaopeng Tu (AAML) · 03 May 2020 · 39 / 35 / 0

Self-Attention with Cross-Lingual Position Representation
Liang Ding, Longyue Wang, Dacheng Tao (MILM) · 28 Apr 2020 · 33 / 37 / 0

Towards Better Modeling Hierarchical Structure for Self-Attention with Ordered Neurons
Jie Hao, Xing Wang, Shuming Shi, Jinfeng Zhang, Zhaopeng Tu · 04 Sep 2019 · 29 / 12 / 0

What you can cram into a single vector: Probing sentence embeddings for linguistic properties
Alexis Conneau, Germán Kruszewski, Guillaume Lample, Loïc Barrault, Marco Baroni · 03 May 2018 · 201 / 882 / 0

A Decomposable Attention Model for Natural Language Inference
Ankur P. Parikh, Oscar Täckström, Dipanjan Das, Jakob Uszkoreit · 06 Jun 2016 · 213 / 1,367 / 0