An Adaptive Method Stabilizing Activations for Enhanced Generalization

Abstract

We introduce AdaAct, a novel optimization algorithm that adjusts learning rates according to activation variance. Our method enhances the stability of neuron outputs by incorporating neuron-wise adaptivity during training, which in turn leads to better generalization -- a complementary approach to conventional activation regularization methods. Experimental results demonstrate AdaAct's competitive performance on standard image classification benchmarks. We evaluate AdaAct on CIFAR and ImageNet, comparing it with other state-of-the-art methods. Importantly, AdaAct effectively bridges the gap between the convergence speed of Adam and the strong generalization of SGD, while maintaining competitive execution times. Code is available at this https URL.
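The abstract does not spell out the update rule, so the following is only a minimal illustrative sketch of the general idea of neuron-wise adaptivity driven by activation variance: per-neuron learning rates are damped by a running estimate of each output neuron's activation variance. All names, the EMA scheme, and the scaling formula below are assumptions for illustration, not the paper's actual AdaAct algorithm.

```python
# Illustrative sketch only (NOT the paper's exact AdaAct update rule):
# scale each output neuron's step by the inverse of a running estimate of
# its activation variance, so noisier neurons take smaller steps.
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer regression problem: y = x @ W_true
n_in, n_out, n_samples = 8, 4, 256
W_true = rng.normal(size=(n_in, n_out))
X = rng.normal(size=(n_samples, n_in))
Y = X @ W_true

W = np.zeros((n_in, n_out))
lr, eps, beta = 0.1, 1e-8, 0.99
act_var = np.ones(n_out)            # running per-neuron activation variance (init to 1 for stability)

for step in range(200):
    idx = rng.choice(n_samples, size=32, replace=False)
    x, y = X[idx], Y[idx]
    a = x @ W                       # neuron activations on this mini-batch
    err = a - y
    grad = x.T @ err / len(idx)     # gradient of 0.5 * MSE w.r.t. W

    # Exponential moving average of each neuron's activation variance
    act_var = beta * act_var + (1 - beta) * a.var(axis=0)

    # Neuron-wise adaptive step: damp updates to neurons with high activation variance
    scale = 1.0 / (np.sqrt(act_var) + eps)   # shape (n_out,), broadcasts over weight columns
    W -= lr * grad * scale

print("final MSE:", float(((X @ W - Y) ** 2).mean()))
```

In this toy setup the variance-based scaling plays the role that the second-moment term plays in Adam, but it is computed from activations rather than gradients; the paper's actual per-neuron statistics and update may differ.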

@article{seung2025_2506.08353,
  title={An Adaptive Method Stabilizing Activations for Enhanced Generalization},
  author={Hyunseok Seung and Jaewoo Lee and Hyunsuk Ko},
  journal={arXiv preprint arXiv:2506.08353},
  year={2025}
}