PromptTSS: A Prompting-Based Approach for Interactive Multi-Granularity Time Series Segmentation

12 June 2025
Ching Chang
Ming-Chih Lo
Wen-Chih Peng
Tien-Fu Chen
    AI4TS
Main: 8 pages · 7 figures · 4 tables · Bibliography: 2 pages
Abstract

Multivariate time series data, collected across various fields such as manufacturing and wearable technology, exhibit states at multiple levels of granularity, from coarse-grained system behaviors to fine-grained, detailed events. Effectively segmenting and integrating states across these different granularities is crucial for tasks like predictive maintenance and performance optimization. However, existing time series segmentation methods face two key challenges: (1) the inability to handle multiple levels of granularity within a unified model, and (2) limited adaptability to new, evolving patterns in dynamic environments. To address these challenges, we propose PromptTSS, a novel framework for time series segmentation with multi-granularity states. PromptTSS uses a unified model with a prompting mechanism that leverages label and boundary information to guide segmentation, capturing both coarse- and fine-grained patterns while adapting dynamically to unseen patterns. Experiments show PromptTSS improves accuracy by 24.49% in multi-granularity segmentation, 17.88% in single-granularity segmentation, and up to 599.24% in transfer learning, demonstrating its adaptability to hierarchical states and evolving time series dynamics.
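
The abstract describes a prompting mechanism in which sparse label and boundary hints guide per-timestep segmentation across granularities. The toy sketch below is only a rough illustration of that interactive interface, not the authors' PromptTSS model: the function name, the nearest-prompt propagation rule, and the data shapes are all illustrative assumptions.

# Hypothetical sketch: sparse label prompts (timestep -> state) and boundary
# prompts (asserted change points) steer a per-timestep state assignment.
import numpy as np

def segment_with_prompts(series, label_prompts, boundary_prompts):
    """Assign a state id to every timestep of a (T, C) multivariate series.

    label_prompts   : dict {timestep: state_id} -- sparse user-provided labels.
    boundary_prompts: iterable of timesteps where a segment boundary is asserted.
    """
    T = len(series)
    # Boundary prompts split the series into contiguous blocks.
    cuts = sorted({0, T} | {int(b) for b in boundary_prompts if 0 < b < T})
    states = np.full(T, -1, dtype=int)  # -1 = unknown state
    for start, end in zip(cuts[:-1], cuts[1:]):
        # Label prompts falling inside this block.
        inside = {t: s for t, s in label_prompts.items() if start <= t < end}
        if inside:
            for t in range(start, end):
                # Each timestep copies the state of its nearest label prompt.
                nearest = min(inside, key=lambda p: abs(p - t))
                states[t] = inside[nearest]
    return states

# Example: 100 timesteps, two coarse states hinted at t=10 and t=80,
# with a boundary prompt asserting a change at t=50.
x = np.random.randn(100, 3)
print(segment_with_prompts(x, {10: 0, 80: 1}, boundary_prompts=[50]))

In the paper's setting this guidance is consumed by a learned unified model rather than a hand-written rule, which is what lets it generalize to coarse- and fine-grained states and to unseen patterns.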

@article{chang2025_2506.11170,
  title={PromptTSS: A Prompting-Based Approach for Interactive Multi-Granularity Time Series Segmentation},
  author={Ching Chang and Ming-Chih Lo and Wen-Chih Peng and Tien-Fu Chen},
  journal={arXiv preprint arXiv:2506.11170},
  year={2025}
}