Clustering in Hilbert simplex geometry

3 April 2017
Frank Nielsen
Ke Sun
arXiv:1704.00454 · PDF · HTML
Abstract

Clustering categorical distributions in the finite-dimensional probability simplex is a fundamental task met in many applications dealing with normalized histograms. Traditionally, the differential-geometric structures of the probability simplex have been used either by (i) setting the Riemannian metric tensor to the Fisher information matrix of the categorical distributions, or (ii) defining the dualistic information-geometric structure induced by a smooth dissimilarity measure, the Kullback-Leibler divergence. In this work, we introduce for clustering tasks a novel computationally-friendly framework for modeling the probability simplex geometrically: the Hilbert simplex geometry. In the Hilbert simplex geometry, the distance is Hilbert's non-separable metric distance, which satisfies the property of information monotonicity and whose distance level sets are described by polytope boundaries. We show that both the Aitchison and Hilbert simplex distances are norm distances on normalized logarithmic representations, with respect to the ℓ2 norm and the variation norm, respectively. We discuss the pros and cons of these different statistical modelings, and experimentally benchmark these different kinds of geometries for center-based k-means and k-center clustering. Furthermore, since a canonical Hilbert distance can be defined on any bounded convex subset of Euclidean space, we also consider Hilbert's geometry of the elliptope of correlation matrices and study its clustering performance compared to the Frobenius and log-det divergences.
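The norm-distance characterizations stated in the abstract admit a direct implementation. Below is a minimal Python sketch (not code from the paper; function names and test values are illustrative) that computes the Aitchison distance as the ℓ2 norm between centered log-ratio representations, and the Hilbert simplex distance as the variation norm (max minus min) of the coordinate-wise log-ratio vector, for points assumed to lie in the open probability simplex.

```python
import numpy as np

def aitchison_distance(p, q):
    """Aitchison distance: l2 norm between centered log-ratio (clr) representations."""
    clr_p = np.log(p) - np.log(p).mean()
    clr_q = np.log(q) - np.log(q).mean()
    return np.linalg.norm(clr_p - clr_q)

def hilbert_simplex_distance(p, q):
    """Hilbert simplex distance: variation norm (max - min) of the log-ratio vector."""
    r = np.log(p) - np.log(q)
    return r.max() - r.min()

# Two illustrative points of the open probability simplex:
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])
print(aitchison_distance(p, q))        # l2 distance in clr coordinates
print(hilbert_simplex_distance(p, q))  # Hilbert metric distance
```

Both functions assume strictly positive coordinates, since the log-ratios diverge on the simplex boundary.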
