ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Efficient Time Series Processing for Transformers and State-Space Models through Token Merging

28 May 2024
Leon Götz
Marcel Kollovieh
Stephan Günnemann
Leo Schwinn
arXiv:2405.17951

Papers citing "Efficient Time Series Processing for Transformers and State-Space Models through Token Merging"

4 / 4 papers shown
Unified Training of Universal Time Series Forecasting Transformers
  Gerald Woo, Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, Doyen Sahoo
  AI4TS · 04 Feb 2024

Improving Robustness against Real-World and Worst-Case Distribution Shifts through Decision Region Quantification
  Leo Schwinn, Leon Bungert, A. Nguyen, René Raab, Falk Pulsmeyer, Doina Precup, Björn Eskofier, Dario Zanca
  OOD · 19 May 2022

Token Pooling in Vision Transformers
  D. Marin, Jen-Hao Rick Chang, Anurag Ranjan, Anish K. Prabhu, Mohammad Rastegari, Oncel Tuzel
  ViT · 08 Oct 2021

Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
  Haoyi Zhou, Shanghang Zhang, J. Peng, Shuai Zhang, Jianxin Li, Hui Xiong, Wan Zhang
  AI4TS · 14 Dec 2020