Temporal Variational Implicit Neural Representations

We introduce Temporal Variational Implicit Neural Representations (TV-INRs), a probabilistic framework for modeling irregular multivariate time series that enables efficient individualized imputation and forecasting. By integrating implicit neural representations with latent variable models, TV-INRs learn distributions over time-continuous generator functions conditioned on signal-specific covariates. Unlike existing approaches that require extensive training, fine-tuning, or meta-learning, our method achieves accurate individualized predictions through a single forward pass. Our experiments demonstrate that a single TV-INR instance can accurately solve diverse imputation and forecasting tasks, offering a computationally efficient and scalable solution for real-world applications. TV-INRs excel especially in low-data regimes, where they outperform existing methods by an order of magnitude in mean squared error on imputation tasks.
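
The sketch below illustrates the general shape of the idea described in the abstract, under our own assumptions rather than the authors' actual architecture: an amortized encoder maps irregularly sampled observations and signal-specific covariates to a latent code, and a decoder acts as an implicit neural representation over continuous time, so imputation and forecasting reduce to evaluating the decoder at the desired timestamps after a single encoding pass. All module names, layer sizes, and the mean-pooling set encoder are illustrative placeholders.

# Minimal sketch of a variational INR over continuous time (not the authors' code).
import torch
import torch.nn as nn

class SetEncoder(nn.Module):
    """Amortized encoder: maps irregular (t, x) observations and
    covariates c to a Gaussian posterior over the latent code z."""
    def __init__(self, x_dim, c_dim, z_dim, h_dim=128):
        super().__init__()
        self.point_net = nn.Sequential(
            nn.Linear(1 + x_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, h_dim))
        self.head = nn.Linear(h_dim + c_dim, 2 * z_dim)

    def forward(self, t_obs, x_obs, c):
        # t_obs: (B, N, 1), x_obs: (B, N, x_dim), c: (B, c_dim)
        h = self.point_net(torch.cat([t_obs, x_obs], dim=-1)).mean(dim=1)
        mu, log_var = self.head(torch.cat([h, c], dim=-1)).chunk(2, dim=-1)
        return mu, log_var

class INRDecoder(nn.Module):
    """Time-continuous generator: evaluates the signal at arbitrary
    query timestamps, conditioned on the latent code z."""
    def __init__(self, x_dim, z_dim, h_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1 + z_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, x_dim))

    def forward(self, t_query, z):
        # t_query: (B, M, 1); broadcast z to every query timestamp
        z = z.unsqueeze(1).expand(-1, t_query.size(1), -1)
        return self.net(torch.cat([t_query, z], dim=-1))

def predict(encoder, decoder, t_obs, x_obs, c, t_query):
    """One forward pass covers both imputation and forecasting:
    encode the observed points once, then query any missing or
    future timestamps with the decoder."""
    mu, log_var = encoder(t_obs, x_obs, c)
    z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()  # reparameterization trick
    return decoder(t_query, z)

Because the decoder is a function of continuous time, the same trained instance serves both tasks: passing timestamps inside the observed window performs imputation, while timestamps beyond it perform forecasting, consistent with the single-model claim in the abstract.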
@article{koyuncu2025_2506.01544,
  title   = {Temporal Variational Implicit Neural Representations},
  author  = {Batuhan Koyuncu and Rachael DeVries and Ole Winther and Isabel Valera},
  journal = {arXiv preprint arXiv:2506.01544},
  year    = {2025}
}