Degrees of freedom for nonlinear least squares estimation

We give a general result on the effective degrees of freedom for nonlinear least squares estimation, which relates the degrees of freedom to the divergence of the estimator. We show that, in a general framework, the divergence of the least squares estimator is a well-defined but potentially negatively biased estimate of the degrees of freedom, and we give an exact representation of the bias. This implies that if we use the divergence as a plug-in estimate of the degrees of freedom in Stein's unbiased risk estimate (SURE), we generally underestimate the true risk. Our result applies, for instance, to model search problems, yielding a finite sample characterization of how much the search contributes to the degrees of freedom. Motivated by the problem of fitting ODE models in systems biology, the general results are illustrated by the estimation of systems of linear ODEs. In this example the divergence turns out to be a useful estimate of the degrees of freedom for ℓ1-constrained models.
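As background for the abstract's central quantities (these are the standard definitions of degrees of freedom and SURE for a Gaussian model, not formulas taken from the paper itself): for observations y ~ N(μ, σ²Iₙ) and a weakly differentiable estimator μ̂(y), the effective degrees of freedom, its divergence-based plug-in estimate, and the resulting risk estimate read:

```latex
% Standard definitions, assuming y \sim N(\mu, \sigma^2 I_n) and a
% weakly differentiable estimator \hat\mu(y).
\[
  \mathrm{df}(\hat\mu)
  = \frac{1}{\sigma^2} \sum_{i=1}^{n} \operatorname{Cov}\bigl(\hat\mu_i(y),\, y_i\bigr),
  \qquad
  \widehat{\mathrm{df}}(y)
  = \nabla \cdot \hat\mu(y)
  = \sum_{i=1}^{n} \frac{\partial \hat\mu_i}{\partial y_i}(y),
\]
\[
  \mathrm{SURE}(y)
  = \bigl\lVert y - \hat\mu(y) \bigr\rVert_2^2
    \;-\; n\sigma^2
    \;+\; 2\sigma^2\, \widehat{\mathrm{df}}(y).
\]
```

If the divergence ∇·μ̂(y) has negative bias as an estimate of df(μ̂), which the abstract states can occur for nonlinear least squares, then the SURE expression built on it underestimates the true risk.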