HyPE: Attention with Hyperbolic Biases for Relative Positional Encoding
Giorgio Angelotti
30 October 2023 · arXiv:2310.19676 (abs | PDF | HTML)
Papers citing "HyPE: Attention with Hyperbolic Biases for Relative Positional Encoding" (5 of 5 papers shown)
Train Short, Test Long: Attention with Linear Biases Enables Input Length Extrapolation
Ofir Press, Noah A. Smith, M. Lewis · 27 Aug 2021

Offline Reinforcement Learning as One Big Sequence Modeling Problem
Michael Janner, Qiyang Li, Sergey Levine · 03 Jun 2021 [OffRL]

RoFormer: Enhanced Transformer with Rotary Position Embedding
Jianlin Su, Yu Lu, Shengfeng Pan, Ahmed Murtadha, Bo Wen, Yunfeng Liu · 20 Apr 2021

Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
Colin Raffel, Noam M. Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu · 23 Oct 2019 [AIMat]

Self-Attention with Relative Position Representations
Peter Shaw, Jakob Uszkoreit, Ashish Vaswani · 06 Mar 2018
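
Several of the listed works (Shaw et al., Press et al.'s ALiBi) and, by its title, HyPE itself belong to the family of methods that encode relative position by adding a position-dependent bias to the attention logits. The following is a minimal illustrative sketch of that family using the ALiBi-style linear penalty -slope·|i - j|; the function names and the slope value are illustrative, and HyPE's hyperbolic variant is defined in the paper itself, not reproduced on this page.

```python
# Sketch (assumed, not taken from this page): attention with an additive
# relative-position bias. ALiBi uses a linear penalty -slope * |i - j|;
# HyPE's title suggests a hyperbolic function of relative distance instead.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_relative_bias(q, k, v, slope=0.5):
    """q, k, v: arrays of shape (seq_len, d). Returns (seq_len, d)."""
    seq_len, d = q.shape
    scores = q @ k.T / np.sqrt(d)                    # (seq_len, seq_len) logits
    pos = np.arange(seq_len)
    distance = np.abs(pos[:, None] - pos[None, :])   # relative distance |i - j|
    bias = -slope * distance                         # ALiBi-style linear bias
    weights = softmax(scores + bias, axis=-1)
    return weights @ v

# Usage
rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(6, 8)) for _ in range(3))
print(attention_with_relative_bias(q, k, v).shape)  # (6, 8)
```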
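
The RoFormer entry above takes a different route: instead of an additive bias, rotary position embedding rotates pairs of query/key dimensions by an angle proportional to the token position, so that query-key dot products depend only on relative offsets. A minimal sketch follows; the "split-half" pairing convention used here is one common variant, and the names are illustrative rather than RoFormer's reference implementation.

```python
# Sketch (assumed, not taken from this page) of rotary position embedding.
import numpy as np

def rotary_embed(x, base=10000.0):
    """x: (seq_len, d) with d even. Rotates each dimension pair by pos * freq."""
    seq_len, d = x.shape
    half = d // 2
    freqs = base ** (-np.arange(half) / half)              # per-pair frequencies
    angles = np.arange(seq_len)[:, None] * freqs[None, :]  # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]                      # pair up dimensions
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

q = np.random.default_rng(1).normal(size=(4, 8))
k = np.random.default_rng(2).normal(size=(4, 8))
q_rot, k_rot = rotary_embed(q), rotary_embed(k)
# q_rot[i] @ k_rot[j] depends on the offset i - j rather than on i and j separately.
print(q_rot.shape, k_rot.shape)
```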