ResearchTrend.AI
Residual Kolmogorov-Arnold Network for Enhanced Deep Learning

7 October 2024
Ray Congrui Yu
Sherry Wu
Jiang Gui
Abstract

Despite their immense success, deep convolutional neural networks (CNNs) are costly to train, and modern architectures can reach hundreds of convolutional layers in depth. Standard convolutional operations are fundamentally limited by their linear nature and fixed activations, so multiple layers are needed to learn complex patterns, making this approach computationally inefficient and prone to optimization difficulties. To address this, we introduce the Residual Kolmogorov-Arnold Network (RKAN), a module that can be easily added into the stages of traditional networks, such as ResNet. The module integrates a polynomial feature transformation that provides the expressive power of many convolutional layers through learnable, non-linear feature refinement. Our proposed RKAN module offers consistent improvements over the base models on various well-known benchmark datasets, such as CIFAR-100, Food-101, and ImageNet.
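To make the idea concrete, here is a minimal sketch of a residual polynomial refinement in the spirit described by the abstract: features are squashed into a bounded range, expanded in a Chebyshev polynomial basis (a common choice in Kolmogorov-Arnold networks), recombined with learnable coefficients, and added back to the input through a residual connection. This is an illustrative reconstruction, not the authors' implementation; the function names, the `tanh` squashing, and the per-basis scalar coefficients are assumptions.

```python
import numpy as np

def chebyshev_basis(x, degree):
    # Chebyshev polynomials of the first kind via the recurrence
    # T_0 = 1, T_1 = x, T_n = 2*x*T_{n-1} - T_{n-2}
    T = [np.ones_like(x), x]
    for _ in range(2, degree + 1):
        T.append(2 * x * T[-1] - T[-2])
    return np.stack(T[: degree + 1])  # shape: (degree+1, *x.shape)

def rkan_block(x, coeffs, degree=3):
    """Hypothetical residual refinement: y = x + sum_k c_k * T_k(tanh(x))."""
    z = np.tanh(x)                      # squash into [-1, 1], the Chebyshev domain
    basis = chebyshev_basis(z, degree)  # (degree+1, *x.shape)
    poly = np.tensordot(coeffs, basis, axes=1)  # weighted sum over basis terms
    return x + poly                     # residual connection preserves the base features

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4, 8, 8))       # (batch, channels, H, W) feature map
coeffs = rng.normal(scale=0.1, size=4)  # one learnable weight per basis term
y = rkan_block(x, coeffs)
print(y.shape)  # (2, 4, 8, 8)
```

Because the polynomial term is added residually, initializing `coeffs` near zero leaves the base network's behavior almost unchanged, which is why such a module can be dropped into an existing stage without disrupting training.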

@article{yu2025_2410.05500,
  title={Residual Kolmogorov-Arnold Network for Enhanced Deep Learning},
  author={Ray Congrui Yu and Sherry Wu and Jiang Gui},
  journal={arXiv preprint arXiv:2410.05500},
  year={2025}
}