Analyticity, Convergence and Convergence Rate of Recursive Maximum Likelihood Estimation in Hidden Markov Models

This paper considers the asymptotic properties of recursive maximum likelihood estimation in hidden Markov models. The focus is on the asymptotic behavior of the log-likelihood function and on the point convergence and convergence rate of the recursive maximum likelihood estimator. Using the principle of analytic continuation, the analyticity of the asymptotic log-likelihood function is established for analytically parameterized hidden Markov models. Relying on this fact and on results from differential geometry (the Łojasiewicz inequality), the almost sure point convergence of the recursive maximum likelihood algorithm is demonstrated, and relatively tight bounds on the convergence rate are derived. In contrast to existing results on the asymptotic behavior of maximum likelihood estimation in hidden Markov models, the results of this paper are obtained without assuming that the log-likelihood function has an isolated maximum at which the Hessian is strictly negative definite.
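
The abstract does not spell out the algorithm itself; as a rough illustration, the Python sketch below implements one standard form of recursive (online) maximum likelihood estimation for a toy two-state HMM with a known transition matrix and Gaussian emissions whose mean depends on a scalar parameter theta. The prediction filter, its theta-derivative (the "tangent filter"), and the decreasing step size gamma_t = a/t are common choices in this literature, not necessarily the exact construction of the paper; all names and constants are illustrative.

```python
# A minimal sketch of recursive maximum likelihood estimation (RMLE) for an
# HMM: stochastic gradient ascent on the incremental log-likelihood, driven
# by the prediction filter and its derivative. Illustrative only; not the
# paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.9, 0.1],            # known transition matrix (assumed)
              [0.2, 0.8]])
theta_true = 2.0                      # true emission mean of state 1

def gauss(y, mu):
    """Emission density N(mu, 1) evaluated at y."""
    return np.exp(-0.5 * (y - mu) ** 2) / np.sqrt(2 * np.pi)

def simulate(T):
    """Generate T observations from the true model."""
    x, ys = 0, []
    for _ in range(T):
        x = rng.choice(2, p=P[x])
        ys.append(rng.normal([0.0, theta_true][x], 1.0))
    return np.array(ys)

def rmle(ys, theta0=0.5, a=0.5):
    """One pass of RMLE: at each observation, compute the increment of the
    log-likelihood and its theta-gradient via the prediction filter p and
    the tangent filter w = dp/dtheta, then take a stochastic gradient step."""
    theta = theta0
    p = np.array([0.5, 0.5])          # prediction filter P(x_t | y_{1:t-1})
    w = np.zeros(2)                   # its derivative with respect to theta
    for t, y in enumerate(ys, start=1):
        mu = np.array([0.0, theta])
        g = gauss(y, mu)              # emission likelihoods per state
        dg = np.array([0.0, (y - theta) * g[1]])  # d g / d theta
        c = p @ g                     # incremental likelihood
        dc = w @ g + p @ dg           # its theta-derivative
        grad = dc / c                 # score increment d log c / d theta
        # Bayes update of the filter, then propagation through P;
        # the tangent filter follows by the product/quotient rule.
        f = p * g / c
        df = (w * g + p * dg) / c - f * (dc / c)
        p = f @ P
        w = df @ P
        theta += (a / t) * grad       # decreasing step size gamma_t = a / t
    return theta

ys = simulate(20000)
print("estimated theta:", rmle(ys))   # should approach theta_true = 2.0
```

Under the paper's analyticity assumptions, the point to which such an iterate converges need not be an isolated maximum with a negative definite Hessian; the Łojasiewicz inequality is what controls the trajectory near the (possibly non-isolated) set of stationary points.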