ResearchTrend.AI
Post-Hoc FREE Calibrating on Kolmogorov-Arnold Networks

3 March 2025
Wenhao Liang
Wei Emma Zhang
Lin Yue
Miao Xu
Olaf Maennel
Weitong Chen
Abstract

Kolmogorov-Arnold Networks (KANs) are neural architectures inspired by the Kolmogorov-Arnold representation theorem that leverage B-spline parameterizations for flexible, locally adaptive function approximation. Although KANs can capture complex nonlinearities beyond those modeled by standard Multi-Layer Perceptrons (MLPs), they frequently exhibit miscalibrated confidence estimates, manifesting as overconfidence in dense data regions and underconfidence in sparse areas. In this work, we systematically examine the impact of four critical hyperparameters, including Layer Width, Grid Order, Shortcut Function, and Grid Range, on the calibration of KANs. Furthermore, we introduce a novel Temperature-Scaled Loss (TSL) that integrates a temperature parameter directly into the training objective, dynamically adjusting the predictive distribution during learning. Both theoretical analysis and extensive empirical evaluations on standard benchmarks demonstrate that TSL significantly reduces calibration errors, thereby improving the reliability of probabilistic predictions. Overall, our study provides actionable insights into the design of spline-based neural networks and establishes TSL as a robust loss solution for enhancing calibration.
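The abstract describes TSL as folding a temperature parameter into the training objective rather than applying it post hoc. The paper's actual formulation is not reproduced on this page; the following is a minimal NumPy sketch of the general idea, assuming TSL divides the logits by a temperature before the softmax cross-entropy (the function name `tsl_loss` and the use of a fixed float temperature are illustrative assumptions — in training, the temperature would typically be a learnable parameter).

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def tsl_loss(logits, labels, temperature):
    """Temperature-scaled cross-entropy: softmax is applied to logits / T.

    T > 1 softens the predictive distribution (less confident),
    T < 1 sharpens it. Illustrative sketch, not the paper's code.
    """
    probs = softmax(logits / temperature)
    n = logits.shape[0]
    # negative log-likelihood of the true class, averaged over the batch
    return -np.log(probs[np.arange(n), labels] + 1e-12).mean()

# Toy batch where the model's raw logits are confidently correct:
logits = np.array([[4.0, 1.0, 0.0],
                   [0.5, 3.0, 0.2]])
labels = np.array([0, 1])

loss_t1 = tsl_loss(logits, labels, temperature=1.0)
loss_t2 = tsl_loss(logits, labels, temperature=2.0)
```

Because the predictions here are correct, raising the temperature softens the correct-class probabilities and increases the loss; during training, optimizing over the temperature lets the model trade off sharpness against calibration.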

View on arXiv
@article{liang2025_2503.01195,
  title={Post-Hoc FREE Calibrating on Kolmogorov-Arnold Networks},
  author={Wenhao Liang and Wei Emma Zhang and Lin Yue and Miao Xu and Olaf Maennel and Weitong Chen},
  journal={arXiv preprint arXiv:2503.01195},
  year={2025}
}