
Efficient Training of Physics-enhanced Neural ODEs via Direct Collocation and Nonlinear Programming

13 pages (main) + 3 pages (bibliography), 10 figures, 4 tables
Abstract

We propose a novel approach for training Physics-enhanced Neural ODEs (PeNODEs) by expressing the training process as a dynamic optimization problem. The full model, including neural components, is discretized using a high-order implicit Runge-Kutta method with flipped Legendre-Gauss-Radau points, resulting in a large-scale nonlinear program (NLP) that is efficiently solved by state-of-the-art NLP solvers such as Ipopt. This formulation enables simultaneous optimization of network parameters and state trajectories, addressing key limitations of ODE solver-based training in terms of stability, runtime, and accuracy. Extending a recent direct collocation-based method for Neural ODEs, we generalize to PeNODEs, incorporate physical constraints, and present a custom, parallelized, open-source implementation. Benchmarks on a Quarter Vehicle Model and a Van der Pol oscillator demonstrate superior accuracy, speed, and generalization with smaller networks compared to other training techniques. We also outline a planned integration into OpenModelica to enable accessible training of Neural DAEs.
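To make the pipeline in the abstract concrete (discretize the full model with implicit Runge-Kutta collocation, then hand the resulting large-scale NLP to Ipopt), the following is a minimal sketch, not the authors' implementation: it uses CasADi's standard right Radau collocation points rather than the flipped Legendre-Gauss-Radau points of the paper, a toy Van der Pol data set generated in place, and an arbitrary 8-neuron network; the collocation order, step size, and all variable names are illustrative assumptions.

# Illustrative sketch only: direct-collocation training of a small neural ODE
# as an NLP solved by Ipopt, here formulated with CasADi. The paper's method
# uses flipped Legendre-Gauss-Radau points and a custom parallel implementation;
# this sketch uses CasADi's built-in Radau points instead.
import numpy as np
import casadi as ca

# --- toy reference data: Van der Pol oscillator sampled on a uniform grid ---
mu = 1.0
def vdp(x):
    return np.array([x[1], mu * (1.0 - x[0] ** 2) * x[1] - x[0]])

N, dt = 40, 0.1                       # number of intervals and step size (assumed)
x_data = np.zeros((N + 1, 2))
x_data[0] = [2.0, 0.0]
for k in range(N):                    # crude forward-Euler "measurements"
    x_data[k + 1] = x_data[k] + dt * vdp(x_data[k])

# --- small neural network as the ODE right-hand side (weights are unknowns) ---
n_x, n_h = 2, 8
W1 = ca.MX.sym('W1', n_h, n_x); b1 = ca.MX.sym('b1', n_h)
W2 = ca.MX.sym('W2', n_x, n_h); b2 = ca.MX.sym('b2', n_x)
theta = ca.vertcat(ca.vec(W1), b1, ca.vec(W2), b2)

def nn_rhs(x):
    return ca.mtimes(W2, ca.tanh(ca.mtimes(W1, x) + b1)) + b2

# --- Radau collocation coefficients: differentiation matrix C, end weights D ---
d = 3                                              # collocation order (assumed)
tau = np.append(0.0, ca.collocation_points(d, 'radau'))
C = np.zeros((d + 1, d + 1)); D = np.zeros(d + 1)
for j in range(d + 1):
    p = np.poly1d([1.0])
    for r in range(d + 1):
        if r != j:
            p *= np.poly1d([1.0, -tau[r]]) / (tau[j] - tau[r])
    D[j] = p(1.0)
    dp = np.polyder(p)
    for r in range(d + 1):
        C[j, r] = dp(tau[r])

# --- build the NLP: network weights AND state trajectory are decision variables ---
w, g, J = [theta], [], 0
Xk = ca.MX.sym('X0', n_x)
w.append(Xk)
J += ca.sumsqr(Xk - x_data[0])
for k in range(N):
    Xc = [ca.MX.sym(f'X_{k}_{j}', n_x) for j in range(d)]
    w += Xc
    for j in range(1, d + 1):
        # collocation equation: interpolating-polynomial derivative = NN dynamics
        xp = C[0, j] * Xk
        for r in range(d):
            xp = xp + C[r + 1, j] * Xc[r]
        g.append(xp - dt * nn_rhs(Xc[j - 1]))
    # continuity with the next interval via the end-point weights D
    Xk_end = D[0] * Xk
    for r in range(d):
        Xk_end = Xk_end + D[r + 1] * Xc[r]
    Xk = ca.MX.sym(f'X_{k + 1}', n_x)
    w.append(Xk)
    g.append(Xk_end - Xk)
    J += ca.sumsqr(Xk - x_data[k + 1])    # data-fitting term on the grid points

nlp = {'x': ca.vertcat(*w), 'f': J, 'g': ca.vertcat(*g)}
solver = ca.nlpsol('solver', 'ipopt', nlp)
sol = solver(x0=0.1 * np.ones(nlp['x'].numel()), lbg=0, ubg=0)  # all equality constraints
print('optimal objective:', float(sol['f']))

Because the states at every collocation node are decision variables alongside the network weights, the solver never integrates the ODE forward in time; this is what "simultaneous optimization of network parameters and state trajectories" means in practice and is the source of the stability and runtime advantages claimed in the abstract.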

@article{langenkamp2025_2505.03552,
  title={Efficient Training of Physics-enhanced Neural ODEs via Direct Collocation and Nonlinear Programming},
  author={Linus Langenkamp and Philip Hannebohm and Bernhard Bachmann},
  journal={arXiv preprint arXiv:2505.03552},
  year={2025}
}