Statistical and Computational Guarantees of Kernel Max-Sliced Wasserstein Distances

24 May 2024
Jie Wang
M. Boedihardjo
Yao Xie
Abstract

Optimal transport has been very successful for various machine learning tasks; however, it is known to suffer from the curse of dimensionality. Hence, dimensionality reduction is desirable when it is applied to high-dimensional data with low-dimensional structure. The kernel max-sliced (KMS) Wasserstein distance is developed for this purpose by finding an optimal nonlinear mapping that reduces data into 1 dimension before computing the Wasserstein distance. However, its theoretical properties have not yet been fully developed. In this paper, we provide sharp finite-sample guarantees, under milder technical assumptions than the state of the art, for the KMS p-Wasserstein distance between two empirical distributions with n samples for general p ∈ [1, ∞). Algorithm-wise, we show that computing the KMS 2-Wasserstein distance is NP-hard, and we then propose a semidefinite relaxation (SDR) formulation, which can be solved efficiently in polynomial time, and provide a relaxation gap for the obtained solution. We provide numerical examples to demonstrate the good performance of our scheme for high-dimensional two-sample testing.
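A minimal illustrative sketch of the KMS idea described above, not the paper's SDR algorithm: lift both samples with a random-feature approximation of an RBF kernel, then approximate the max over "slicing" directions in feature space by crude random search, computing the 1-D 2-Wasserstein distance along each direction. All names and parameters below (gamma, n_features, n_directions) are illustrative assumptions.

```python
# Heuristic approximation of the kernel max-sliced 2-Wasserstein distance.
# This is NOT the SDR method from the paper; it only illustrates the objective.
import numpy as np

def rbf_random_features(X, W, b):
    """Random Fourier features approximating an RBF kernel feature map."""
    return np.sqrt(2.0 / W.shape[1]) * np.cos(X @ W + b)

def w2_1d(u, v):
    """2-Wasserstein distance between two 1-D empirical distributions of equal size."""
    return np.sqrt(np.mean((np.sort(u) - np.sort(v)) ** 2))

def kms_w2_random_search(X, Y, gamma=1.0, n_features=256, n_directions=2000, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Random Fourier feature parameters for k(x, y) = exp(-gamma * ||x - y||^2).
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    phi_X = rbf_random_features(X, W, b)
    phi_Y = rbf_random_features(Y, W, b)
    # Monte Carlo search over unit directions in feature space; the paper
    # instead relaxes this non-convex max problem to a semidefinite program.
    best = 0.0
    for _ in range(n_directions):
        theta = rng.normal(size=n_features)
        theta /= np.linalg.norm(theta)
        best = max(best, w2_1d(phi_X @ theta, phi_Y @ theta))
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 20))
    Y = rng.normal(loc=0.5, size=(200, 20))  # mean-shifted alternative
    print("approx. KMS W2:", kms_w2_random_search(X, Y))
```

The random search only lower-bounds the max-sliced objective and scales poorly with the feature dimension, which is precisely the gap the paper's semidefinite relaxation is designed to close.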

View on arXiv
@article{wang2025_2405.15441,
  title={Statistical and Computational Guarantees of Kernel Max-Sliced Wasserstein Distances},
  author={Jie Wang and March Boedihardjo and Yao Xie},
  journal={arXiv preprint arXiv:2405.15441},
  year={2025}
}