Tsallis and Rényi deformations linked via a new λ-duality

26 July 2021
Ting-Kam Leonard Wong
Jun Zhang
Abstract

Tsallis and Rényi entropies, which are monotone transformations of each other, are deformations of the celebrated Shannon entropy. Maximization of these deformed entropies, under suitable constraints, leads to the q-exponential family, which has applications in non-extensive statistical physics, information theory and statistics. In previous information-geometric studies, the q-exponential family was analyzed using classical convex duality and Bregman divergence. In this paper, we show that a generalized λ-duality, where λ = 1 − q is the constant information-geometric curvature, leads to a generalized exponential family which is essentially equivalent to the q-exponential family and has deep connections with Rényi entropy and optimal transport. Using this generalized convex duality and its associated logarithmic divergence, we show that our λ-exponential family satisfies properties that parallel and generalize those of the exponential family. Under our framework, the Rényi entropy and divergence arise naturally, and we give a new proof of the Tsallis/Rényi entropy maximizing property of the q-exponential family. We also introduce a λ-mixture family, which may be regarded as the dual of the λ-exponential family, and connect it with other mixture-type families. Finally, we discuss a duality between the λ-exponential family and the λ-logarithmic divergence, and study its statistical consequences.
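As a quick reference for the deformations named in the abstract, the LaTeX sketch below collects the standard textbook definitions of the q-deformed logarithm/exponential and the Tsallis and Rényi entropies, written under the convention λ = 1 − q used above. The normalizations shown are the common ones and may differ slightly from the paper's own conventions.

\[
  \log_q(x) \;=\; \frac{x^{1-q}-1}{1-q} \;=\; \frac{x^{\lambda}-1}{\lambda},
  \qquad
  \exp_q(u) \;=\; \bigl[\,1+(1-q)\,u\,\bigr]_{+}^{1/(1-q)} \;=\; \bigl(1+\lambda u\bigr)_{+}^{1/\lambda},
\]
\[
  S_q(p) \;=\; \frac{1}{q-1}\Bigl(1-\sum_i p_i^{\,q}\Bigr)
  \quad\text{(Tsallis)},
  \qquad
  H_q(p) \;=\; \frac{1}{1-q}\,\log\sum_i p_i^{\,q}
  \quad\text{(R\'enyi)},
\]
\[
  \text{with the monotone relation}\quad
  H_q \;=\; \frac{1}{1-q}\,\log\bigl(1+(1-q)\,S_q\bigr),
\]

and both entropies reduce to the Shannon entropy as q → 1 (equivalently λ → 0). Schematically, the q-exponential family consists of densities proportional to exp_q(θ · F(x)) for a statistic F, under suitable normalization; the exact parametrization of the λ-exponential family is developed in the paper itself.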

View on arXiv: https://arxiv.org/abs/2107.11925