Unsupervised Pelage Pattern Unwrapping for Animal Re-identification

18 June 2025
Aleksandr Algasov
Ekaterina A. Nepovinnykh
Fedor Zolotarev
Tuomas Eerola
Heikki Kälviäinen
Pavel Zemčík
Charles V. Stewart
arXiv:2506.15369 (abs, PDF, HTML)
Main text: 5 pages, 4 figures, 2 tables; bibliography: 1 page
Abstract

Existing individual re-identification methods often struggle with the deformable nature of animal fur or skin patterns, which undergo geometric distortions due to body movement and posture changes. In this paper, we propose a geometry-aware texture mapping approach that unwarps pelage patterns, the unique markings found on an animal's skin or fur, into a canonical UV space, enabling more robust feature matching. Our method uses surface normal estimation to guide the unwrapping process while preserving geometric consistency between the 3D surface and the 2D texture space. We focus on two challenging species, Saimaa ringed seals (Pusa hispida saimensis) and leopards (Panthera pardus), both of which have distinctive yet highly deformable fur patterns. By integrating our pattern-preserving UV mapping with existing re-identification techniques, we demonstrate improved accuracy across diverse poses and viewing angles. Our framework does not require ground-truth UV annotations and can be trained in a self-supervised manner. Experiments on seal and leopard datasets show up to a 5.4% improvement in re-identification accuracy.
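
The paper's code is not reproduced here, so as a rough illustration only, below is a minimal PyTorch sketch of how normal-guided UV prediction with a self-supervised reconstruction loss could be set up. Everything in it (the UVNet architecture, the unwrap_losses function, the loss weight, and the learnable canonical texture) is an assumption made for illustration, not the authors' implementation.

# Hypothetical sketch, not the authors' code. Assumptions: a small CNN
# predicts a per-pixel (u, v) field from an RGB image plus estimated
# surface normals; training is self-supervised via (1) re-sampling a
# learnable canonical texture at the predicted UV coordinates and
# comparing it with the observed image, and (2) a total-variation
# smoothness prior on the UV field.
import torch
import torch.nn as nn
import torch.nn.functional as F

class UVNet(nn.Module):
    """Predicts a per-pixel UV coordinate field in [-1, 1] from RGB + normals."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 2, 3, padding=1), nn.Tanh(),  # (u, v) in [-1, 1]
        )

    def forward(self, rgb, normals):
        x = torch.cat([rgb, normals], dim=1)   # (B, 6, H, W)
        return self.encoder(x)                 # (B, 2, H, W)

def unwrap_losses(uv, rgb, texture, tv_weight=0.1):
    """Reconstruction loss in image space plus UV smoothness prior."""
    grid = uv.permute(0, 2, 3, 1)              # (B, H, W, 2) for grid_sample
    resampled = F.grid_sample(texture, grid, align_corners=False)
    recon = F.l1_loss(resampled, rgb)
    # Total-variation smoothness on the UV field (L1 spatial differences).
    tv = (uv[..., 1:, :] - uv[..., :-1, :]).abs().mean() + \
         (uv[..., :, 1:] - uv[..., :, :-1]).abs().mean()
    return recon + tv_weight * tv

# Toy usage: the canonical texture is a learnable pattern per individual.
net = UVNet()
rgb = torch.rand(1, 3, 64, 64)
normals = torch.rand(1, 3, 64, 64) * 2 - 1    # stand-in for estimated normals
texture = torch.rand(1, 3, 128, 128, requires_grad=True)
loss = unwrap_losses(net(rgb, normals), rgb, texture)
loss.backward()

The design choice this sketch tries to reflect is the abstract's central point: because appearance is compared in a canonical UV space rather than in image space, the reconstruction loss can only be driven down if differently posed, deformed views of the same pattern are mapped to consistent UV coordinates, which is what makes downstream feature matching more robust.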

BibTeX:
@article{algasov2025_2506.15369,
  title={Unsupervised Pelage Pattern Unwrapping for Animal Re-identification},
  author={Aleksandr Algasov and Ekaterina Nepovinnykh and Fedor Zolotarev and Tuomas Eerola and Heikki Kälviäinen and Pavel Zemčík and Charles V. Stewart},
  journal={arXiv preprint arXiv:2506.15369},
  year={2025}
}