

Enhancing Time Series Forecasting with Fuzzy Attention-Integrated Transformers

31 March 2025
Sanjay Chakraborty
Fredrik Heintz
Abstract

This paper introduces FANTF (Fuzzy Attention Network-Based Transformers), a novel approach that integrates fuzzy logic into existing transformer architectures to advance time series forecasting, classification, and anomaly detection. FANTF leverages a proposed fuzzy attention mechanism that incorporates fuzzy membership functions to handle uncertainty and imprecision in noisy and ambiguous time series data. By embedding fuzzy logic principles into the self-attention module of existing transformer architectures, FANTF enhances the model's ability to capture complex temporal dependencies and multivariate relationships. The framework combines fuzzy-enhanced attention with a set of existing benchmark transformer-based architectures to provide efficient forecasting, classification, and anomaly detection. Specifically, FANTF generates learnable fuzzy attention scores that highlight the relative importance of temporal features and data points, offering insight into its decision-making process. Experimental evaluations on real-world datasets show that FANTF significantly improves forecasting, classification, and anomaly detection performance over traditional transformer-based models.
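The abstract describes modulating self-attention with fuzzy membership functions. As a rough, non-authoritative sketch (the paper's exact formulation is not reproduced here), one way to realize this is to weight raw attention scores by a Gaussian membership function before the softmax; the `centers` and `widths` parameters below are hypothetical stand-ins for the paper's learnable fuzziness parameters:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuzzy_attention(Q, K, V, centers, widths):
    """Scaled dot-product attention modulated by Gaussian fuzzy
    membership scores. This is an illustrative sketch, not the
    paper's actual mechanism; `centers`/`widths` are assumed
    learnable parameters of the membership function."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)          # raw attention logits (n_q, n_k)
    # Gaussian membership: logits near a learned center get weight ~1,
    # outlying (noisy) logits are smoothly suppressed.
    membership = np.exp(-((scores - centers) ** 2) / (2 * widths ** 2))
    weights = softmax(scores * membership, axis=-1)
    return weights @ V
```

In this sketch the membership values act as soft gates on the attention logits, which is one plausible way to down-weight noisy time steps before aggregation.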

@article{chakraborty2025_2504.00070,
  title={Enhancing Time Series Forecasting with Fuzzy Attention-Integrated Transformers},
  author={Sanjay Chakraborty and Fredrik Heintz},
  journal={arXiv preprint arXiv:2504.00070},
  year={2025}
}