Efficient Time Series Processing for Transformers and State-Space Models through Token Merging
arXiv: 2405.17951 · 28 May 2024
Leon Götz, Marcel Kollovieh, Stephan Günnemann, Leo Schwinn

Cited By
Papers citing "Efficient Time Series Processing for Transformers and State-Space Models through Token Merging" (4 of 4 papers shown)

1. Unified Training of Universal Time Series Forecasting Transformers
   Gerald Woo, Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, Doyen Sahoo
   AI4TS · 04 Feb 2024

2. Improving Robustness against Real-World and Worst-Case Distribution Shifts through Decision Region Quantification
   Leo Schwinn, Leon Bungert, A. Nguyen, René Raab, Falk Pulsmeyer, Doina Precup, Björn Eskofier, Dario Zanca
   OOD · 19 May 2022

3. Token Pooling in Vision Transformers
   D. Marin, Jen-Hao Rick Chang, Anurag Ranjan, Anish K. Prabhu, Mohammad Rastegari, Oncel Tuzel
   ViT · 08 Oct 2021

4. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
   Haoyi Zhou, Shanghang Zhang, J. Peng, Shuai Zhang, Jianxin Li, Hui Xiong, Wan Zhang
   AI4TS · 14 Dec 2020