Monge-Ampere Regularization for Learning Arbitrary Shapes from Point Clouds

24 October 2024
Chuanxiang Yang
Yuanfeng Zhou
Guangshun Wei
Long Ma
Junhui Hou
Yuan Liu
Wenping Wang
Abstract

Among commonly used implicit geometry representations, the signed distance function (SDF) is limited to modeling watertight shapes, while the unsigned distance function (UDF) can represent a wider variety of surfaces. However, the UDF's inherent theoretical shortcoming, namely its non-differentiability at the zero level set, results in sub-optimal reconstruction quality. In this paper, we propose the scaled-squared distance function (S²DF), a novel implicit surface representation for modeling arbitrary surface types. S²DF does not distinguish between inside and outside regions while effectively addressing the non-differentiability of the UDF at the zero level set. We demonstrate that S²DF satisfies a second-order partial differential equation of Monge-Ampère type, which allows us to develop a learning pipeline that leverages a novel Monge-Ampère regularization to learn S²DF directly from raw unoriented point clouds, without supervision from ground-truth S²DF values. Extensive experiments across multiple datasets show that our method significantly outperforms state-of-the-art supervised approaches that require ground-truth surface information for training. The code will be publicly available at this https URL.
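The abstract's central observation, that the UDF is non-differentiable at the zero level set while a squared distance is not, can be illustrated with a one-dimensional toy example. The sketch below is not from the paper's code: the function names `udf` and `sq_df` are hypothetical, the "surface" is the single point x = 0, and the plain squared distance stands in for S²DF (the paper's actual scaling is not reproduced here).

```python
import numpy as np

# Toy "surface" at x = 0. The unsigned distance d(x) = |x| has a kink at
# its zero level set; the squared distance d(x)**2 is smooth there.

def udf(x):
    """Unsigned distance to the surface {0}."""
    return np.abs(x)

def sq_df(x):
    """Squared distance; a stand-in for the paper's S2DF (scaling omitted)."""
    return udf(x) ** 2

h = 1e-4

# One-sided finite-difference derivatives of the UDF at x = 0 disagree
# (+1 from the right, -1 from the left): the derivative does not exist.
right = (udf(h) - udf(0.0)) / h    # approx +1
left = (udf(0.0) - udf(-h)) / h    # approx -1

# The squared distance has true derivative 2x, which is 0 at x = 0 and
# well defined from both sides.
grad_sq = (sq_df(h) - sq_df(-h)) / (2 * h)   # approx 0

print(right, left, grad_sq)
```

The disagreement between `right` and `left` is exactly the gradient ambiguity that degrades UDF-based reconstruction near the surface, and squaring removes it at the cost of a vanishing gradient magnitude there, which is where the paper's Monge-Ampère characterization supplies the missing constraint.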

@article{yang2025_2410.18477,
  title={Monge-Ampere Regularization for Learning Arbitrary Shapes from Point Clouds},
  author={Chuanxiang Yang and Yuanfeng Zhou and Guangshun Wei and Long Ma and Junhui Hou and Yuan Liu and Wenping Wang},
  journal={arXiv preprint arXiv:2410.18477},
  year={2025}
}