LAST SToP For Modeling Asynchronous Time Series

4 February 2025
Authors: Shubham Gupta, Thibaut Durand, Graham Taylor, Lilian W. Białokozowicz
Topics: AI4TS

Papers citing "LAST SToP For Modeling Asynchronous Time Series"

14 / 14 papers shown
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting
Authors: Kashif Rasul, Arjun Ashok, Andrew Robert Williams, Hena Ghonia, Rishika Bhagwatkar, ..., Nicolas Chapados, Alexandre Drouin, Valentina Zantedeschi, Yuriy Nevmyvaka, Irina Rish
Topics: AI4TS, BDL
Citations: 47
12 Oct 2023

BLOOM: A 176B-Parameter Open-Access Multilingual Language Model
Authors: BigScience Workshop: Teven Le Scao, Angela Fan, Christopher Akiki, ..., Zhongli Xie, Zifan Ye, M. Bras, Younes Belkada, Thomas Wolf
Topics: VLM
Citations: 2,364
09 Nov 2022

Counterfactual Neural Temporal Point Process for Estimating Causal Influence of Misinformation on Social Media
Authors: Yizhou Zhang, Defu Cao, Yang Liu
Topics: CML
Citations: 23
14 Oct 2022

PromptCast: A New Prompt-based Learning Paradigm for Time Series Forecasting
Authors: Hao Xue, Flora D. Salim
Topics: AI4TS
Citations: 144
20 Sep 2022

Matryoshka Representation Learning
Authors: Aditya Kusupati, Gantavya Bhatt, Aniket Rege, Matthew Wallingford, Aditya Sinha, ..., William Howard-Snyder, Kaifeng Chen, Sham Kakade, Prateek Jain, Ali Farhadi
Citations: 83
26 May 2022

Transformer Embeddings of Irregularly Spaced Events and Their Participants
Authors: Chenghao Yang, Hongyuan Mei, Jason Eisner
Topics: AI4TS
Citations: 60
31 Dec 2021

BEiT: BERT Pre-Training of Image Transformers
Authors: Hangbo Bao, Li Dong, Songhao Piao, Furu Wei
Topics: ViT
Citations: 2,790
15 Jun 2021

The Power of Scale for Parameter-Efficient Prompt Tuning
Authors: Brian Lester, Rami Al-Rfou, Noah Constant
Topics: VPVLM
Citations: 3,952
18 Apr 2021

Learning Transferable Visual Models From Natural Language Supervision
Authors: Alec Radford, Jong Wook Kim, Chris Hallacy, Aditya A. Ramesh, Gabriel Goh, ..., Amanda Askell, Pamela Mishkin, Jack Clark, Gretchen Krueger, Ilya Sutskever
Topics: CLIP, VLM
Citations: 28,659
26 Feb 2021

Prefix-Tuning: Optimizing Continuous Prompts for Generation
Authors: Xiang Lisa Li, Percy Liang
Citations: 4,209
01 Jan 2021

Language Models are Few-Shot Learners
Authors: Tom B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, ..., Christopher Berner, Sam McCandlish, Alec Radford, Ilya Sutskever, Dario Amodei
Topics: BDL
Citations: 41,106
28 May 2020

Transformer Hawkes Process
Authors: Simiao Zuo, Haoming Jiang, Zichong Li, T. Zhao, H. Zha
Topics: AI4TS
Citations: 290
21 Feb 2020

Attention Is All You Need
Authors: Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin
Topics: 3DV
Citations: 129,831
12 Jun 2017

Deep Networks with Stochastic Depth
Authors: Gao Huang, Yu Sun, Zhuang Liu, Daniel Sedra, Kilian Q. Weinberger
Citations: 2,344
30 Mar 2016