PI-HMR: Towards Robust In-bed Temporal Human Shape Reconstruction with Contact Pressure Sensing

27 February 2025
Ziyu Wu
Yufan Xiong
Mengting Niu
Fangting Xie
Quan Wan
Qijun Ying
Boyan Liu
Xiaohui Cai
Abstract

Long-term in-bed monitoring benefits automatic and real-time health management within healthcare, and the advancement of human shape reconstruction technologies further enhances the representation and visualization of users' activity patterns. However, existing technologies are primarily based on visual cues, facing serious challenges in non-line-of-sight and privacy-sensitive in-bed scenes. Pressure-sensing bedsheets offer a promising solution for real-time motion reconstruction. Yet limited exploration of model designs and data has hindered further development. To tackle these issues, we propose a general framework that bridges gaps in data annotation and model design. First, we introduce SMPLify-IB, an optimization method that overcomes the depth ambiguity issue in top-view scenarios through gravity constraints, enabling the generation of high-quality 3D human shape annotations for in-bed datasets. We then present PI-HMR, a temporal human shape estimator that regresses meshes from pressure sequences. By integrating multi-scale feature fusion with high-pressure distribution and spatial position priors, PI-HMR outperforms SOTA methods with a 17.01 mm decrease in Mean-Per-Joint-Error. This work provides a whole…
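The abstract describes a temporal estimator that maps pressure-map sequences to human meshes. The sketch below is a hypothetical PyTorch illustration of that general pipeline, not the authors' PI-HMR implementation: the module name PressureSequenceRegressor, the layer sizes, and the use of a transformer for temporal fusion are all assumptions chosen for clarity. It encodes each pressure frame with a small CNN, fuses features across time, and regresses SMPL-style pose, shape, and camera parameters per frame.

```python
# Hypothetical sketch (not the authors' code): a temporal pressure-to-SMPL
# regressor illustrating the kind of pipeline the abstract describes.
import torch
import torch.nn as nn


class PressureSequenceRegressor(nn.Module):
    """Maps a pressure sequence (B, T, H, W) to per-frame SMPL-style parameters."""

    def __init__(self, feat_dim=256, n_layers=4, n_heads=8):
        super().__init__()
        # Small CNN backbone applied to each pressure frame independently.
        self.frame_encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # -> (B*T, feat_dim, 1, 1)
        )
        # Temporal encoder fuses information across the pressure sequence.
        layer = nn.TransformerEncoderLayer(
            d_model=feat_dim, nhead=n_heads, batch_first=True)
        self.temporal_encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Regression heads: 24 joints x 6D rotation, 10 shape betas,
        # and a weak-perspective camera/translation term.
        self.pose_head = nn.Linear(feat_dim, 24 * 6)
        self.shape_head = nn.Linear(feat_dim, 10)
        self.cam_head = nn.Linear(feat_dim, 3)

    def forward(self, pressure_seq):
        b, t, h, w = pressure_seq.shape
        frames = pressure_seq.reshape(b * t, 1, h, w)
        feats = self.frame_encoder(frames).flatten(1)   # (B*T, feat_dim)
        feats = feats.reshape(b, t, -1)                 # (B, T, feat_dim)
        feats = self.temporal_encoder(feats)            # temporal fusion
        return {
            "pose6d": self.pose_head(feats),  # (B, T, 144)
            "betas": self.shape_head(feats),  # (B, T, 10)
            "cam": self.cam_head(feats),      # (B, T, 3)
        }


if __name__ == "__main__":
    # Dummy pressure sequence: batch of 2, 16 frames, 64x32 sensor grid.
    x = torch.randn(2, 16, 64, 32)
    out = PressureSequenceRegressor()(x)
    print({k: v.shape for k, v in out.items()})
```

The per-frame predictions would typically be fed to an SMPL body model to obtain vertices and joints for supervision; that step, along with the paper's multi-scale feature fusion and pressure-distribution priors, is omitted here.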

@article{wu2025_2503.00068,
  title={PI-HMR: Towards Robust In-bed Temporal Human Shape Reconstruction with Contact Pressure Sensing},
  author={Ziyu Wu and Yufan Xiong and Mengting Niu and Fangting Xie and Quan Wan and Qijun Ying and Boyan Liu and Xiaohui Cai},
  journal={arXiv preprint arXiv:2503.00068},
  year={2025}
}