Remote photoplethysmography (rPPG) technology infers heart rate by capturing subtle color changes in facial skin with a camera, demonstrating great potential for non-contact heart rate measurement. However, measurement accuracy decreases significantly in complex scenarios, such as under lighting changes and head movements, compared to ideal laboratory conditions. Existing deep learning models often neglect the quantification of measurement uncertainty, limiting their credibility in dynamic scenes. To address the insufficient reliability of rPPG measurement in complex scenarios, this paper introduces Bayesian neural networks to the rPPG field for the first time, proposing the Robust Fusion Bayesian Physiological Network (RF-BayesPhysNet), which can model both aleatoric and epistemic uncertainty. It leverages variational inference to balance accuracy and computational efficiency. Because the rPPG field currently lacks uncertainty estimation metrics, this paper also proposes a new set of methods, using the Spearman correlation coefficient, prediction interval coverage, and confidence interval width, to measure the effectiveness of uncertainty estimation methods under different noise conditions. Experiments show that the model, with only double the parameters of a traditional network model, achieves an MAE of 2.56 on the UBFC-RPPG dataset, surpassing most models. It demonstrates good uncertainty estimation capability under no-noise and low-noise conditions, providing prediction confidence and significantly enhancing robustness in real-world applications. We have open-sourced the code at this https URL.
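The abstract describes variational inference over network weights for epistemic uncertainty plus a learned aleatoric term, but does not detail the architecture. Below is a minimal PyTorch sketch of that general technique (factorized Gaussian weight posteriors with the reparameterization trick, plus Monte Carlo sampling at inference). BayesLinear, BayesHeartRateHead, and predict_with_uncertainty are hypothetical names for illustration, not the paper's API.

```python
# Minimal sketch of variational inference for uncertainty-aware regression.
# All class/function names are illustrative assumptions, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesLinear(nn.Module):
    """Linear layer with a factorized Gaussian posterior over weights."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.w_mu = nn.Parameter(torch.empty(out_features, in_features))
        self.w_logvar = nn.Parameter(torch.full((out_features, in_features), -5.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_logvar = nn.Parameter(torch.full((out_features,), -5.0))
        nn.init.xavier_uniform_(self.w_mu)

    def forward(self, x):
        # Reparameterization trick: w = mu + sigma * eps, with eps ~ N(0, I).
        w = self.w_mu + torch.exp(0.5 * self.w_logvar) * torch.randn_like(self.w_mu)
        b = self.b_mu + torch.exp(0.5 * self.b_logvar) * torch.randn_like(self.b_mu)
        return F.linear(x, w, b)

    def kl(self):
        # KL divergence of the posterior to a standard-normal prior N(0, 1).
        def term(mu, logvar):
            return 0.5 * torch.sum(mu ** 2 + logvar.exp() - logvar - 1.0)
        return term(self.w_mu, self.w_logvar) + term(self.b_mu, self.b_logvar)


class BayesHeartRateHead(nn.Module):
    """Maps features to a heart-rate mean and an aleatoric log-variance."""

    def __init__(self, feat_dim=128):
        super().__init__()
        self.layer = BayesLinear(feat_dim, 2)  # outputs [hr_mean, hr_logvar]

    def forward(self, feats):
        out = self.layer(feats)
        return out[:, 0], out[:, 1]


@torch.no_grad()
def predict_with_uncertainty(model, feats, n_samples=30):
    """Epistemic uncertainty = spread across stochastic forward passes;
    aleatoric uncertainty = mean of the predicted variance head."""
    means, alea = [], []
    for _ in range(n_samples):
        mu, logvar = model(feats)
        means.append(mu)
        alea.append(logvar.exp())
    means = torch.stack(means)            # (n_samples, batch)
    epistemic = means.var(dim=0)          # disagreement across weight samples
    aleatoric = torch.stack(alea).mean(dim=0)
    return means.mean(dim=0), epistemic, aleatoric
```

Training such a head would minimize the Gaussian negative log-likelihood of the heart-rate target plus a weighted sum of the layers' KL terms (the evidence lower bound), which is how variational inference trades predictive fit against posterior complexity at roughly double the parameter count of a deterministic network, consistent with the parameter overhead the abstract reports.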
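The abstract names three uncertainty-evaluation metrics but not their exact formulations. The sketch below is one plausible instantiation, assuming Gaussian prediction intervals of the form mu ± z·sigma; the function name and the 95% interval choice (z = 1.96) are assumptions, not the paper's definitions.

```python
# Sketch of the three uncertainty metrics named in the abstract, under the
# assumption of Gaussian prediction intervals. Names are illustrative.
import numpy as np
from scipy.stats import spearmanr


def uncertainty_metrics(y_true, y_pred, y_std, z=1.96):
    """y_true, y_pred, y_std: 1-D arrays of ground-truth heart rates,
    predictions, and predicted standard deviations."""
    abs_err = np.abs(y_true - y_pred)

    # 1. Spearman correlation: does higher predicted uncertainty
    #    actually track higher absolute error?
    rho, _ = spearmanr(y_std, abs_err)

    # 2. Prediction interval coverage: fraction of true values falling
    #    inside [mu - z*sigma, mu + z*sigma].
    lower, upper = y_pred - z * y_std, y_pred + z * y_std
    coverage = np.mean((y_true >= lower) & (y_true <= upper))

    # 3. Mean confidence interval width: narrower is better at equal coverage.
    width = np.mean(upper - lower)

    return {"spearman_rho": rho, "coverage": coverage, "mean_width": width}


# Usage with synthetic values (illustrative only, not paper results):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y_true = 60 + 30 * rng.random(200)            # heart rates in bpm
    y_std = 1.0 + 2.0 * rng.random(200)           # predicted std devs
    y_pred = y_true + y_std * rng.standard_normal(200)
    print(uncertainty_metrics(y_true, y_pred, y_std))
```

Read together, the three numbers are complementary: rank correlation checks that the model knows when it is wrong, coverage checks that the stated intervals are calibrated, and width penalizes intervals that achieve coverage only by being uninformatively wide.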