Deterministic Neural SDEs for Affordable Uncertainty Quantification

Neural Stochastic Differential Equations (NSDEs) model the drift and diffusion functions of a stochastic process as neural networks. While NSDEs are known to predict time series accurately, their uncertainty quantification properties remain unexplored. We report the empirical finding that obtaining well-calibrated uncertainty estimates from NSDEs is computationally prohibitive. As a remedy, we develop a computationally affordable deterministic scheme for expressing the likelihood of a sequence whose dynamics are governed by an NSDE, applicable to both training and prediction. Our method introduces a bidirectional moment matching scheme: one pass vertical along the neural network layers and one horizontal along the time direction, benefiting from an original combination of effective approximations. We observe in multiple experiments that Monte Carlo sampling matches the uncertainty calibration quality of our method only after incurring at least five times more computation cost. Thanks to the numerical stability of deterministic training, our method also improves prediction accuracy.
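To make the idea concrete, below is a minimal, hedged sketch of deterministic moment propagation for a one-dimensional neural SDE. It is not the paper's algorithm: the "horizontal" direction is handled by Euler-Maruyama moment updates, while the "vertical" pass through the drift and diffusion networks is approximated by a simple first-order linearization (finite differences) rather than the paper's layerwise moment matching. All network architectures and names are illustrative assumptions.

```python
# Hedged sketch (assumptions noted above): deterministically propagate the mean and
# variance of a 1-D neural SDE state, instead of sampling many stochastic rollouts.
import numpy as np

rng = np.random.default_rng(0)

def mlp(params, x):
    """Tiny tanh MLP mapping a scalar input to a scalar output."""
    (W1, b1), (W2, b2) = params
    h = np.tanh(W1 * x + b1)
    return float(W2 @ h + b2)

def init_params():
    """Randomly initialized illustrative network weights."""
    return [(rng.normal(size=8), rng.normal(size=8)),
            (0.1 * rng.normal(size=8), rng.normal())]

drift_params = init_params()   # stand-in for the drift network f_theta
diff_params = init_params()    # stand-in for the diffusion network g_phi

def propagate_moments(mean0, var0, dt=0.01, steps=100, eps=1e-4):
    """Propagate (mean, variance) of the state deterministically through time."""
    mean, var = mean0, var0
    for _ in range(steps):
        f = mlp(drift_params, mean)
        # Finite-difference Jacobian of the drift at the current mean: a crude
        # stand-in for the paper's layerwise (vertical) moment matching.
        df = (mlp(drift_params, mean + eps) - mlp(drift_params, mean - eps)) / (2 * eps)
        g2 = mlp(diff_params, mean) ** 2
        # Euler-Maruyama moment updates (horizontal direction), first-order in dt.
        mean = mean + f * dt
        var = var + (2.0 * df * var + g2) * dt
        var = max(var, 1e-12)  # keep the variance non-negative numerically
    return mean, var

m, v = propagate_moments(mean0=0.5, var0=0.01)
print(f"predicted mean = {m:.4f}, variance = {v:.4f}")
```

With the mean and variance propagated in closed form, a Gaussian predictive likelihood of an observed sequence can be evaluated deterministically at each step, which is what makes both training and prediction cheap compared with Monte Carlo rollouts.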