Embedding principle of homogeneous neural network for classification problem

Abstract

Understanding the convergence points and optimization landscape of neural networks is crucial, particularly for homogeneous networks where Karush-Kuhn-Tucker (KKT) points of the associated maximum-margin problem often characterize solutions. This paper investigates the relationship between such KKT points across networks of different widths generated via neuron splitting. We introduce and formalize the \textbf{KKT point embedding principle}, establishing that KKT points of a homogeneous network's max-margin problem ($P_{\Phi}$) can be embedded into the KKT points of a larger network's problem ($P_{\tilde{\Phi}}$) via specific linear isometric transformations corresponding to neuron splitting. We rigorously prove that this principle holds for neuron splitting in both two-layer and deep homogeneous networks. Furthermore, we connect this static embedding to the dynamics of gradient flow training with smooth losses. We demonstrate that trajectories initiated from appropriately mapped points remain mapped throughout training, and that the resulting $\omega$-limit sets of directions are correspondingly mapped, i.e., $T(L(\theta(0))) = L(\boldsymbol{\eta}(0))$, thereby preserving alignment with KKT directions whenever directional convergence occurs. Our findings offer insights into the effects of network width, parameter redundancy, and the structural connections between solutions found via optimization in homogeneous networks of varying sizes.
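To make the splitting transformation concrete, below is a minimal numerical sketch in Python/NumPy of one plausible form of isometric neuron splitting for a two-layer ReLU network: a neuron $(a_k, w_k)$ is replaced by two copies scaled by $\cos\gamma$ and $\sin\gamma$. By positive homogeneity of ReLU this preserves the network function, and $\cos^2\gamma + \sin^2\gamma = 1$ makes the linear map an isometry. The helper `split_neuron` and the angle `gamma` are illustrative assumptions, not necessarily the paper's exact construction.

```python
import numpy as np

def forward(a, W, x):
    """Two-layer homogeneous ReLU network: f(x) = sum_k a_k * relu(w_k . x)."""
    return a @ np.maximum(W @ x, 0.0)

def split_neuron(a, W, k, gamma):
    """Illustrative isometric splitting (assumed form): replace neuron
    (a_k, w_k) with (a_k cos(g), w_k cos(g)) and (a_k sin(g), w_k sin(g))
    for g in (0, pi/2). Positive homogeneity of a*relu(w.x) preserves the
    function; cos^2 + sin^2 = 1 preserves the parameter norm."""
    c, s = np.cos(gamma), np.sin(gamma)
    a_new = np.concatenate([a[:k], [c * a[k], s * a[k]], a[k + 1:]])
    W_new = np.vstack([W[:k], c * W[k:k + 1], s * W[k:k + 1], W[k + 1:]])
    return a_new, W_new

rng = np.random.default_rng(0)
a = rng.standard_normal(3)          # output weights, one per hidden neuron
W = rng.standard_normal((3, 5))     # hidden weights, rows are w_k
x = rng.standard_normal(5)

a2, W2 = split_neuron(a, W, k=1, gamma=0.3)

# The widened network computes the same function ...
assert np.allclose(forward(a, W, x), forward(a2, W2, x))
# ... and the total parameter norm is unchanged (the map is isometric).
norm = lambda a_, W_: np.sqrt(np.sum(a_**2) + np.sum(W_**2))
assert np.isclose(norm(a, W), norm(a2, W2))
print("function and norm preserved under splitting")
```

Because such a map is linear and norm-preserving, applying it to a KKT point of the narrow network's max-margin problem yields a candidate KKT point of the wider problem, which is the static half of the embedding principle described above.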

@article{zhang2025_2505.12419,
  title={Embedding principle of homogeneous neural network for classification problem},
  author={Jiahan Zhang and Tao Luo and Yaoyu Zhang},
  journal={arXiv preprint arXiv:2505.12419},
  year={2025}
}