Low Rank Factorization for Compact Multi-Head Self-Attention
arXiv: 1912.00835
26 November 2019
Sneha Mehta, Huzefa Rangwala, Naren Ramakrishnan
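This page does not reproduce the paper's abstract or method. As a rough illustration of the general idea named in the title, and not the authors' specific formulation, the sketch below replaces each dense d_model x d_model attention projection with a pair of low-rank factors; the module names (LowRankLinear, CompactMultiHeadSelfAttention) and the choice of rank=64 are illustrative assumptions.

```python
import torch
import torch.nn as nn


class LowRankLinear(nn.Module):
    """Replace a dense d_in x d_out projection with two rank-r factors,
    so the effective weight matrix has rank at most r << min(d_in, d_out)."""
    def __init__(self, d_in, d_out, rank):
        super().__init__()
        self.U = nn.Linear(d_in, rank, bias=False)
        self.V = nn.Linear(rank, d_out, bias=False)

    def forward(self, x):
        return self.V(self.U(x))


class CompactMultiHeadSelfAttention(nn.Module):
    """Illustrative multi-head self-attention whose Q/K/V and output
    projections use low-rank factors instead of full d_model x d_model matrices."""
    def __init__(self, d_model=512, n_heads=8, rank=64):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.q_proj = LowRankLinear(d_model, d_model, rank)
        self.k_proj = LowRankLinear(d_model, d_model, rank)
        self.v_proj = LowRankLinear(d_model, d_model, rank)
        self.out_proj = LowRankLinear(d_model, d_model, rank)

    def forward(self, x):
        # x: (batch, seq_len, d_model)
        b, t, d = x.shape

        def split(z):
            # (b, t, d_model) -> (b, n_heads, t, d_head)
            return z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)

        q, k, v = split(self.q_proj(x)), split(self.k_proj(x)), split(self.v_proj(x))
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, t, d)
        return self.out_proj(out)


x = torch.randn(2, 16, 512)
y = CompactMultiHeadSelfAttention()(x)
print(y.shape)  # torch.Size([2, 16, 512])
```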
Papers citing "Low Rank Factorization for Compact Multi-Head Self-Attention" (5 papers):

Survey on Computer Vision Techniques for Internet-of-Things Devices. Ishmeet Kaur, Adwaita Janardhan Jadhav. 02 Aug 2023.
Is Attention Better Than Matrix Decomposition? Zhengyang Geng, Meng-Hao Guo, Hongxu Chen, Xia Li, Ke Wei, Zhouchen Lin. 09 Sep 2021.
A Decomposable Attention Model for Natural Language Inference. Ankur P. Parikh, Oscar Täckström, Dipanjan Das, Jakob Uszkoreit. 06 Jun 2016.
Effective Approaches to Attention-based Neural Machine Translation. Thang Luong, Hieu H. Pham, Christopher D. Manning. 17 Aug 2015.
Convolutional Neural Networks for Sentence Classification. Yoon Kim. 25 Aug 2014.