ResearchTrend.AI


PoseBH: Prototypical Multi-Dataset Training Beyond Human Pose Estimation

23 May 2025
Uyoung Jeong
Jonathan Freer
Seungryul Baek
Hyung Jin Chang
Kwang In Kim
    3DH
Abstract

We study multi-dataset training (MDT) for pose estimation, where skeletal heterogeneity presents a unique challenge that existing methods have yet to address. In traditional domains, e.g., regression and classification, MDT typically relies on dataset merging or multi-head supervision. However, the diversity of skeleton types and limited cross-dataset supervision complicate integration in pose estimation. To address these challenges, we introduce PoseBH, a new MDT framework that tackles keypoint heterogeneity and limited supervision through two key techniques. First, we propose nonparametric keypoint prototypes learned within a unified embedding space, enabling seamless integration across skeleton types. Second, we develop a cross-type self-supervision mechanism that aligns keypoint predictions with keypoint embedding prototypes, providing supervision without relying on teacher-student models or additional augmentations. PoseBH substantially improves generalization across whole-body and animal pose datasets, including COCO-WholeBody, AP-10K, and APT-36K, while preserving performance on standard human pose benchmarks (COCO, MPII, and AIC). Furthermore, our learned keypoint embeddings transfer effectively to hand shape estimation (InterHand2.6M) and human body shape estimation (3DPW). The code for PoseBH is available at: this https URL.
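The core idea of matching predictions against keypoint prototypes in a shared embedding space can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the embedding dimension, the cosine-similarity matching, the softmax temperature, and the random prototypes are all assumptions made for the sketch.

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-8):
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

# Shared embedding space: one prototype vector per keypoint of each
# skeleton type (e.g. 17 COCO body keypoints, 17 AP-10K animal keypoints).
# Random prototypes stand in for the learned ones.
rng = np.random.default_rng(0)
embed_dim = 32
prototypes = {
    "coco": l2_normalize(rng.normal(size=(17, embed_dim))),
    "ap10k": l2_normalize(rng.normal(size=(17, embed_dim))),
}

def match_keypoints(embeddings, skeleton, temperature=0.1):
    """Assign each predicted embedding a soft keypoint label by cosine
    similarity to that skeleton type's prototypes (softmax over keypoints)."""
    protos = prototypes[skeleton]                     # (K, D)
    sims = l2_normalize(embeddings) @ protos.T        # (N, K) cosine similarities
    logits = sims / temperature
    logits -= logits.max(axis=1, keepdims=True)       # numerically stable softmax
    probs = np.exp(logits)
    return probs / probs.sum(axis=1, keepdims=True)

# An embedding close to a prototype scores highest for that keypoint,
# which is what makes cross-skeleton supervision through the shared
# space possible in principle.
pred = prototypes["coco"][3] + 0.05 * rng.normal(size=embed_dim)
probs = match_keypoints(pred[None, :], "coco")
assert probs.argmax(axis=1)[0] == 3
```

Because every skeleton type's prototypes live in the same embedding space, a prediction from one dataset can, in this scheme, be softly matched against another skeleton's prototypes, which is the kind of cross-type alignment the abstract describes.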

View on arXiv
@article{jeong2025_2505.17475,
  title={PoseBH: Prototypical Multi-Dataset Training Beyond Human Pose Estimation},
  author={Uyoung Jeong and Jonathan Freer and Seungryul Baek and Hyung Jin Chang and Kwang In Kim},
  journal={arXiv preprint arXiv:2505.17475},
  year={2025}
}