
Learning Optimal Filters Using Variational Inference

Abstract

Filtering, the task of estimating the conditional distribution of the states of a dynamical system given partial and noisy observations, is important in many areas of science and engineering, including weather and climate prediction. However, the filtering distribution is generally intractable to obtain for high-dimensional, nonlinear systems. Filters used in practice, such as the ensemble Kalman filter (EnKF), provide biased probabilistic estimates for nonlinear systems and have numerous tuning parameters. Here, we present a framework for learning a parameterized analysis map, the transformation that takes samples from a forecast distribution and combines them with an observation to update the approximate filtering distribution, using variational inference. In principle, this can lead to a better approximation of the filtering distribution, and hence smaller bias. We show that this methodology can be used to learn the gain matrix, in an affine analysis map, for filtering linear and nonlinear dynamical systems; we also study the learning of inflation and localization parameters for an EnKF. The framework developed here can also be used to learn new filtering algorithms with more general forms for the analysis map.
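To make the setting concrete, the following is a minimal sketch, not the authors' code, of the affine analysis map described in the abstract: each forecast ensemble member is updated as x_a = x_f + K (y + eta - H x_f), where the gain K is a free parameter that could be learned by variational inference. For reference, the classical (stochastic) EnKF gain, with a multiplicative inflation parameter, is computed from the forecast ensemble. The dimensions, observation operator H, and noise covariance R are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, N = 4, 2, 100            # state dim, obs dim, ensemble size (assumed)
H = np.eye(m, d)               # linear observation operator (assumed)
R = 0.1 * np.eye(m)            # observation-noise covariance (assumed)

def analysis_map(X_f, y, K):
    """Affine analysis map: update each forecast member with gain K,
    using perturbed observations (stochastic-EnKF convention)."""
    eta = rng.multivariate_normal(np.zeros(m), R, size=X_f.shape[1]).T
    return X_f + K @ (y[:, None] + eta - H @ X_f)

def enkf_gain(X_f, inflation=1.0):
    """Classical EnKF gain from the (inflated) forecast ensemble covariance."""
    A = X_f - X_f.mean(axis=1, keepdims=True)       # ensemble anomalies
    C = inflation * (A @ A.T) / (X_f.shape[1] - 1)  # forecast covariance
    return C @ H.T @ np.linalg.inv(H @ C @ H.T + R)

X_f = rng.standard_normal((d, N))                   # toy forecast ensemble
y = H @ np.ones(d) + 0.1 * rng.standard_normal(m)   # toy observation
K = enkf_gain(X_f, inflation=1.05)                  # EnKF choice of gain
X_a = analysis_map(X_f, y, K)                       # analysis ensemble
print(X_a.shape)  # (4, 100)
```

In the paper's framework, the fixed EnKF gain above would instead be treated as a learnable parameter (or replaced by a more general parameterized map) and optimized variationally; this sketch only fixes the notation.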

@article{bach2025_2406.18066,
  title={Learning Optimal Filters Using Variational Inference},
  author={Eviatar Bach and Ricardo Baptista and Enoch Luk and Andrew Stuart},
  journal={arXiv preprint arXiv:2406.18066},
  year={2025}
}