arXiv: 2206.09457
All you need is feedback: Communication with block attention feedback codes
Emre Ozfatura, Yulin Shao, A. Perotti, B. Popović, Deniz Gunduz
19 June 2022
Papers citing "All you need is feedback: Communication with block attention feedback codes" (18 papers)
1. Semantic Communications with Discrete-time Analog Transmission: A PAPR Perspective
   Yulin Shao, Deniz Gunduz (17 Aug 2022)

2. AttentionCode: Ultra-Reliable Feedback Codes for Short-Packet Communications
   Yulin Shao, Emre Ozfatura, A. Perotti, B. Popović, Deniz Gunduz (30 May 2022)

3. Transformer-Empowered 6G Intelligent Networks: From Massive MIMO Processing to Semantic Communication
   Yang Wang, Zhen Gao, Dezhi Zheng, Sheng Chen, Deniz Gündüz, H. Vincent Poor (08 May 2022)

4. DRF Codes: Deep SNR-Robust Feedback Codes
   Mahdi Boloursaz Mashhadi, Deniz Gunduz, A. Perotti, B. Popović (22 Dec 2021)

5. A Survey of Transformers
   Tianyang Lin, Yuxin Wang, Xiangyang Liu, Xipeng Qiu (08 Jun 2021)

6. Deep Extended Feedback Codes
   A. Safavi, A. Perotti, B. Popović, Mahdi Boloursaz Mashhadi, Deniz Gunduz (04 May 2021)

7. Transformer Feed-Forward Layers Are Key-Value Memories
   Mor Geva, R. Schuster, Jonathan Berant, Omer Levy (29 Dec 2020)

8. Rethinking Attention with Performers
   K. Choromanski, Valerii Likhosherstov, David Dohan, Xingyou Song, Andreea Gane, ..., Afroz Mohiuddin, Lukasz Kaiser, David Belanger, Lucy J. Colwell, Adrian Weller (30 Sep 2020)

9. Linformer: Self-Attention with Linear Complexity
   Sinong Wang, Belinda Z. Li, Madian Khabsa, Han Fang, Hao Ma (08 Jun 2020)

10. On Layer Normalization in the Transformer Architecture
    Ruibin Xiong, Yunchang Yang, Di He, Kai Zheng, Shuxin Zheng, Chen Xing, Huishuai Zhang, Yanyan Lan, Liwei Wang, Tie-Yan Liu (12 Feb 2020)

11. Reformer: The Efficient Transformer
    Nikita Kitaev, Lukasz Kaiser, Anselm Levskaya (13 Jan 2020)

12. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
    Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova (11 Oct 2018)

13. Deepcode: Feedback Codes via Deep Learning
    Hyeji Kim, Yihan Jiang, Sreeram Kannan, Sewoong Oh, Pramod Viswanath (02 Jul 2018)

14. Attention Is All You Need
    Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin (12 Jun 2017)

15. Gaussian Error Linear Units (GELUs)
    Dan Hendrycks, Kevin Gimpel (27 Jun 2016)

16. On the Properties of Neural Machine Translation: Encoder-Decoder Approaches
    Kyunghyun Cho, B. V. Merrienboer, Dzmitry Bahdanau, Yoshua Bengio (03 Sep 2014)

17. Long Short-Term Memory Based Recurrent Neural Network Architectures for Large Vocabulary Speech Recognition
    Hasim Sak, A. Senior, F. Beaufays (05 Feb 2014)

18. Distributed Representations of Words and Phrases and their Compositionality
    Tomas Mikolov, Ilya Sutskever, Kai Chen, G. Corrado, J. Dean (16 Oct 2013)