ResearchTrend.AI

Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts (arXiv:2410.10469)

14 October 2024
Xu Liu
Juncheng Liu
Gerald Woo
Taha Aksu
Yuxuan Liang
Roger Zimmermann
Chenghao Liu
Silvio Savarese
Caiming Xiong
Doyen Sahoo
AI4TS

Papers citing "Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts"

18 papers shown
TRACE: Grounding Time Series in Context for Multimodal Embedding and Retrieval
Jialin Chen
Ziyu Zhao
Gaukhar Nurbek
Aosong Feng
Ali Maatouk
Leandros Tassiulas
Yifeng Gao
Rex Ying
AI4TS
52
0
0
10 Jun 2025
Time Series Representations for Classification Lie Hidden in Pretrained Vision Transformers
Simon Roschmann
Quentin Bouniot
Vasilii Feofanov
I. Redko
Zeynep Akata
AI4TS
43
0
0
10 Jun 2025
MIRA: Medical Time Series Foundation Model for Real-World Health Data
Hao Li
Bowen Deng
Chang Xu
Zhiyuan Feng
Viktor Schlegel
...
Yizheng Sun
Jingyuan Sun
Kailai Yang
Yiyao Yu
Jiang Bian
AI4TS, OOD, AI4CE
62
0
0
09 Jun 2025
Mixture-of-Experts for Personalized and Semantic-Aware Next Location Prediction
Shuai Liu
Ning Cao
Yile Chen
Yue Jiang
Gao Cong
38
0
0
30 May 2025
BLAST: Balanced Sampling Time Series Corpus for Universal Forecasting Models
Zezhi Shao
Yujie Li
Fei Wang
Chengqing Yu
Yisong Fu
Tangwen Qian
Bin Xu
Boyu Diao
Yongjun Xu
Xueqi Cheng
AI4TS
92
0
0
23 May 2025
Time Tracker: Mixture-of-Experts-Enhanced Foundation Time Series Forecasting Model with Decoupled Training Pipelines
Xiaohou Shi
Ke Li
Aobo Liang
Yan Sun
AI4TS
120
0
0
21 May 2025
Towards a Foundation Model for Communication Systems
Davide Buffelli
Sowmen Das
Yu-Wei Lin
Sattar Vakili
Chien-Yi Wang
Masoud Attarifar
Pritthijit Nath
Da-shan Shiu
112
0
0
20 May 2025
This Time is Different: An Observability Perspective on Time Series Foundation Models
Ben Cohen
Emaad Khwaja
Youssef Doubli
Salahidine Lemaachi
Chris Lettieri
...
Kan Wang
Stephan Xie
David Asker
Ameet Talwalkar
Othmane Abou-Amal
AI4TS, AI4CE
85
0
0
20 May 2025
True Zero-Shot Inference of Dynamical Systems Preserving Long-Term Statistics
Christoph Jürgen Hemmer
Daniel Durstewitz
AI4TS, SyDa, AI4CE
299
1
0
19 May 2025
Learning Soft Sparse Shapes for Efficient Time-Series Classification
Zhen Liu
Yicheng Luo
Yangqiu Song
Emadeldeen Eldele
Min-man Wu
Qianli Ma
AI4TS
136
0
0
11 May 2025
FT-MoE: Sustainable-learning Mixture of Experts Model for Fault-Tolerant Computing with Multiple Tasks
Wenjing Xiao
Wenhao Song
Miaojiang Chen
Ruikun Luo
Min Chen
MoE
465
0
0
29 Apr 2025
CITRAS: Covariate-Informed Transformer for Time Series Forecasting
Yosuke Yamaguchi
Issei Suemitsu
Wenpeng Wei
AI4TS
114
2
0
31 Mar 2025
NdLinear: Don't Flatten! Building Superior Neural Architectures by Preserving N-D Structure
Alex Reneau
Jerry Yao-Chieh Hu
Zhongfang Zhuang
Ting-Chun Liu
Xiang He
Judah Goldfeder
Nadav Timor
Allen Roush
Ravid Shwartz-Ziv
HAI
108
0
0
21 Mar 2025
Empowering Time Series Analysis with Synthetic Data: A Survey and Outlook in the Era of Foundation Models
Xu Liu
Taha Aksu
Juncheng Liu
Qingsong Wen
Yuxuan Liang
Caiming Xiong
Siyang Song
Doyen Sahoo
Junnan Li
Chenghao Liu
AI4TS
92
1
0
14 Mar 2025
Unify and Anchor: A Context-Aware Transformer for Cross-Domain Time Series Forecasting
Xiaobin Hong
Jiawei Zhang
Wenzhong Li
Sanglu Lu
Jiajun Li
AI4TS
111
0
0
03 Mar 2025
Investigating Compositional Reasoning in Time Series Foundation Models
Willa Potosnak
Cristian Challu
Mononito Goswami
Kin G. Olivares
Michał Wiliński
Nina Żukowska
Artur Dubrawski
ReLM, AI4TS, LRM
134
2
0
09 Feb 2025
Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
Xiaoming Shi
Shiyu Wang
Yuqi Nie
Dianqi Li
Zhou Ye
Qingsong Wen
Ming Jin
AI4TS
191
56
0
24 Sep 2024
Time-FFM: Towards LM-Empowered Federated Foundation Model for Time Series Forecasting
Qingxiang Liu
Xu Liu
Chenghao Liu
Qingsong Wen
Yuxuan Liang
AI4TS, AI4CE
109
11
0
23 May 2024